Finetuning Mask R-CNN on the EgoHands dataset

Weights - https://drive.google.com/open?id=1gzG0UTi6uhw7OgP_6ZXq7H9CqdBu2xdt

First, download the dataset from - http://vision.soic.indiana.edu/projects/egohands/

Extract it and you will find a collection of MATLAB files and images.

You need to run these MATLAB files to generate the labels.

However, I created a simple script that does all of this and stores the labels, masks, and boxes in a folder that PyTorch can then consume.

So, first set up MATLAB, then run getData.m, which performs all the necessary preprocessing.

Once this is done, you can proceed with this notebook.

In [2]:
import os
import random
import time
import csv

import numpy as np

import torch
import torchvision

from PIL import Image
import cv2


%matplotlib inline
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle
from IPython import display


from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# transforms.py, engine.py and utils.py come from the torchvision
# detection reference scripts (vision/references/detection)
import transforms as T

from engine import train_one_epoch, evaluate
import utils

%load_ext autoreload
%autoreload 2

# from torchvision import transforms
# transform = transforms.Compose([transforms.ToTensor()])  # Convert image to PyTorch Tensor
device = torch.device('cuda') if torch.cuda.is_available() else torch.device('cpu')
In [2]:
!ls egohands/DATA_IMAGES/ | head -10 
Image10_100.jpg
Image10_10.jpg
Image10_11.jpg
Image10_12.jpg
Image10_13.jpg
Image10_14.jpg
Image10_15.jpg
Image10_16.jpg
Image10_17.jpg
Image10_18.jpg
In [3]:
!ls egohands/DATA_MASKS/ | head -10
Mask10_100.jpg
Mask10_10.jpg
Mask10_11.jpg
Mask10_12.jpg
Mask10_13.jpg
Mask10_14.jpg
Mask10_15.jpg
Mask10_16.jpg
Mask10_17.jpg
Mask10_18.jpg

Let's visualize some boxes and masks from our dataset.

(This requires the dataset images to be downloaded.) I used this cell only to explore the dataset, so it's not necessary to run it.

In [3]:
i=2
j=3
mask_path = "./egohands/DATA_MASKS/Mask"+str(i)+"_"+str(j)+".jpg"
box_csv = "./egohands/DATA_BOXES/Box"+str(i)+"_"+str(j)+".csv"
mask = Image.open(mask_path)
mask = np.array(mask)
plt.imshow(mask)
with open(box_csv, 'r') as f:
    reader = csv.reader(f)
    boxes = list(reader)
final_boxes = []
for box in boxes:
    # each CSV row is [x, y, width, height]
    x, y, width, height = (int(v) for v in box[:4])
    if width == 0 or height == 0:
        # skip empty placeholder boxes
        continue
    final_boxes.append([x, y, width, height])
    plt.gca().add_patch(Rectangle((x, y), width, height,
                                  linewidth=1, edgecolor='r', facecolor='none'))
print(final_boxes) 
plt.show()
[[398, 686, 93, 34], [673, 425, 332, 295], [612, 267, 132, 101], [476, 262, 126, 104]]

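
The boxes above are stored as `[x, y, width, height]`, while Mask R-CNN expects corner coordinates `[xmin, ymin, xmax, ymax]` (the dataset class below does this conversion inline). A minimal conversion helper (hypothetical name, not part of the notebook) would be:

```python
def xywh_to_xyxy(box):
    """Convert a [x, y, width, height] box to [xmin, ymin, xmax, ymax]."""
    x, y, w, h = box
    return [x, y, x + w, y + h]

# first box from the printed output above
print(xywh_to_xyxy([398, 686, 93, 34]))  # [398, 686, 491, 720]
```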
Now we create a PyTorch-compatible dataset from our directory.

In [5]:
class HandsDataset(object):
    def __init__(self, root, transforms):
        self.root = root
        self.transforms = transforms
        self.imgs = list(sorted(os.listdir(os.path.join(root, "DATA_IMAGES/"))))
        self.masks = list(sorted(os.listdir(os.path.join(root, "DATA_MASKS/"))))
        self.boxes = list(sorted(os.listdir(os.path.join(root, "DATA_BOXES/"))))

    def __getitem__(self, idx):
        # load the image, mask and box annotations for this index
        img_path = os.path.join(self.root, "DATA_IMAGES/", self.imgs[idx])
        mask_path = os.path.join(self.root, "DATA_MASKS/", self.masks[idx])
        box_path = os.path.join(self.root, "DATA_BOXES/", self.boxes[idx])
        img = Image.open(img_path).convert("RGB")
        mask = Image.open(mask_path)
#         plt.imshow(mask)
        
        mask = np.array(mask)
        with open(box_path, 'r') as f:
            reader = csv.reader(f)
            boxes = list(reader)    
        final_boxes = []
        for box in boxes:
            x = int(box[0])
            y = int(box[1])
            width = int(box[2])
            height = int(box[3])
            # EgoHands frames are 1280x720; nudge coordinates off the exact
            # frame border so boxes stay strictly inside the image
            if x <= 0:
                x += 1
            if y <= 0:
                y += 1
            if x + width == 1280:
                x -= 1
            if y + height == 720:
                y -= 1
            # drop tiny boxes and boxes that still fall outside the frame
            if width < 20 or height < 20:
                continue
            elif x + width > 1280 or y + height > 720:
                continue
            else:
                # store as [xmin, ymin, xmax, ymax], the format Mask R-CNN expects
                final_boxes.append([x, y, x + width, y + height])
        masks = np.zeros((len(final_boxes), 720, 1280))
        for fb, box in enumerate(final_boxes):
            xmin, ymin, xmax, ymax = box
            # copy the mask pixels that fall inside this box and binarize
            # them, giving each instance its own 0/1 mask channel
            masks[fb][ymin:ymax, xmin:xmax] = mask[ymin:ymax, xmin:xmax]
            masks[fb] = np.where(masks[fb] > 0, 1, 0)
                
        boxes = torch.as_tensor(final_boxes, dtype=torch.float32)
        labels = torch.ones((len(final_boxes),), dtype=torch.int64)
        image_id = torch.tensor([idx])
        masks = torch.as_tensor(masks, dtype=torch.uint8)
        iscrowd = torch.zeros((len(final_boxes),), dtype=torch.int64)
        
        target = {}
        target["boxes"] = boxes
        target["labels"] = labels
        target["masks"] = masks
        target["image_id"] = image_id
        target["iscrowd"] = iscrowd

        if self.transforms is not None:
            try:
                img, target = self.transforms(img, target)
            except Exception:
                # a failure here usually points at a corrupt or empty sample
                print("problem at " + img_path)

        return img, target

    def __len__(self):
        return len(self.imgs)

Load the Mask R-CNN model pretrained on COCO and replace its heads to match our number of classes.

In [6]:
def get_model_instance_segmentation(num_classes):
    # load an instance segmentation model pre-trained on COCO
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(pretrained=True)

    # get number of input features for the classifier
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    # replace the pre-trained head with a new one
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

    # now get the number of input features for the mask classifier
    in_features_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
    hidden_layer = 256
    # and replace the mask predictor with a new one
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_features_mask,
                                                       hidden_layer,
                                                       num_classes)

    return model
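
To see what the box-head replacement changes dimensionally, here is a tiny stand-in (hypothetical `TinyBoxPredictor`, mirroring the shape of torchvision's `FastRCNNPredictor`, which is essentially two linear layers over the pooled box features):

```python
import torch
from torch import nn

class TinyBoxPredictor(nn.Module):
    """Toy analogue of FastRCNNPredictor: class scores + per-class box deltas."""
    def __init__(self, in_features, num_classes):
        super().__init__()
        self.cls_score = nn.Linear(in_features, num_classes)
        self.bbox_pred = nn.Linear(in_features, num_classes * 4)

    def forward(self, x):
        return self.cls_score(x), self.bbox_pred(x)

# 8 region proposals, 1024 pooled features each, 2 classes (background + hand)
head = TinyBoxPredictor(in_features=1024, num_classes=2)
scores, deltas = head(torch.zeros(8, 1024))
print(tuple(scores.shape), tuple(deltas.shape))  # (8, 2) (8, 8)
```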
In [7]:
def get_transform(train):
    transforms = []
    transforms.append(T.ToTensor())
    # the train flag is currently unused; augmentations such as
    # T.RandomHorizontalFlip(0.5) could be appended here when train is True
    return T.Compose(transforms)
In [8]:
def collate_fn(batch):
    # images in a detection batch carry different numbers of boxes/masks,
    # so return parallel tuples instead of stacking into one tensor
    return tuple(zip(*batch))
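
A custom collate is needed because each image has a different number of boxes, so targets cannot be stacked into a single tensor; zipping the batch keeps images and targets as parallel tuples. A toy illustration (strings and dicts standing in for tensors):

```python
batch = [("img_a", {"boxes": [[0, 0, 5, 5]]}),
         ("img_b", {"boxes": [[1, 1, 4, 4], [2, 2, 6, 6]]})]

images, targets = tuple(zip(*batch))
print(images)                    # ('img_a', 'img_b')
print(len(targets[1]["boxes"]))  # 2
```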
In [9]:
# our dataset has two classes only - background and hand
num_classes = 2
dataset = HandsDataset('egohands', get_transform(train=True))
dataset_test = HandsDataset('egohands', get_transform(train=False))

# split the dataset in train and test set
indices = torch.randperm(len(dataset)).tolist()
dataset = torch.utils.data.Subset(dataset, indices[:-50])
dataset_test = torch.utils.data.Subset(dataset_test, indices[-50:])

# define training and validation data loaders
data_loader = torch.utils.data.DataLoader(
    dataset, batch_size=2, shuffle=True, num_workers=1,
    collate_fn=collate_fn)

data_loader_test = torch.utils.data.DataLoader(
    dataset_test, batch_size=1, shuffle=False, num_workers=1,
    collate_fn=collate_fn)

Let's inspect what we get from our PyTorch dataset.

In [15]:
dataset[0]
Out[15]:
(tensor([[[0.1412, 0.1412, 0.1412,  ..., 0.0980, 0.0980, 0.0980],
          [0.1412, 0.1412, 0.1412,  ..., 0.1020, 0.1020, 0.0980],
          [0.1412, 0.1412, 0.1412,  ..., 0.1059, 0.1059, 0.1020],
          ...,
          [0.2745, 0.2745, 0.2745,  ..., 0.4314, 0.4275, 0.4235],
          [0.2745, 0.2745, 0.2745,  ..., 0.4314, 0.4275, 0.4235],
          [0.2745, 0.2745, 0.2745,  ..., 0.4314, 0.4275, 0.4235]],
 
         [[0.1882, 0.1882, 0.1882,  ..., 0.1098, 0.1098, 0.1098],
          [0.1882, 0.1882, 0.1882,  ..., 0.1137, 0.1137, 0.1098],
          [0.1882, 0.1882, 0.1882,  ..., 0.1176, 0.1176, 0.1137],
          ...,
          [0.4784, 0.4784, 0.4784,  ..., 0.4196, 0.4157, 0.4118],
          [0.4784, 0.4784, 0.4784,  ..., 0.4196, 0.4157, 0.4118],
          [0.4784, 0.4784, 0.4784,  ..., 0.4196, 0.4157, 0.4118]],
 
         [[0.1725, 0.1725, 0.1725,  ..., 0.0824, 0.0824, 0.0824],
          [0.1725, 0.1725, 0.1725,  ..., 0.0863, 0.0863, 0.0824],
          [0.1725, 0.1725, 0.1725,  ..., 0.0902, 0.0902, 0.0863],
          ...,
          [0.6745, 0.6745, 0.6745,  ..., 0.3529, 0.3490, 0.3451],
          [0.6745, 0.6745, 0.6745,  ..., 0.3529, 0.3490, 0.3451],
          [0.6745, 0.6745, 0.6745,  ..., 0.3529, 0.3490, 0.3451]]]),
 {'boxes': tensor([[480., 331., 694., 593.],
          [770., 392., 984., 645.]]),
  'image_id': tensor([3873]),
  'iscrowd': tensor([0, 0]),
  'labels': tensor([1, 1]),
  'masks': tensor([[[0, 0, 0,  ..., 0, 0, 0],
           [0, 0, 0,  ..., 0, 0, 0],
           [0, 0, 0,  ..., 0, 0, 0],
           ...,
           [0, 0, 0,  ..., 0, 0, 0],
           [0, 0, 0,  ..., 0, 0, 0],
           [0, 0, 0,  ..., 0, 0, 0]],
  
          [[0, 0, 0,  ..., 0, 0, 0],
           [0, 0, 0,  ..., 0, 0, 0],
           [0, 0, 0,  ..., 0, 0, 0],
           ...,
           [0, 0, 0,  ..., 0, 0, 0],
           [0, 0, 0,  ..., 0, 0, 0],
           [0, 0, 0,  ..., 0, 0, 0]]], dtype=torch.uint8)})

We get the masks, boxes, and labels along with other information; this is exactly what we need for training.


Let's look at the predictions before training the model.

In [ ]:
# Helper functions to draw predictions:

# NOTE: this re-binds T, shadowing the detection `transforms as T` imported earlier
import torchvision.transforms as T
def plot_mask_rcnn_result(img_path, threshold=0.5, rect_th=3, text_size=3, text_th=3):
    masks, boxes, pred_cls = get_prediction(img_path, threshold)
    img = cv2.imread(img_path)
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
    for i in range(len(masks)):
        rgb_mask = random_colour_masks(masks[i])
        img = cv2.addWeighted(img, 1, rgb_mask, 0.5, 0)
        cv2.rectangle(img, boxes[i][0], boxes[i][1],color=(0, 255, 0), thickness=rect_th)
    plt.figure(figsize=(20,30))
    plt.imshow(img)
    plt.xticks([])
    plt.yticks([])
    plt.show()

def random_colour_masks(image):
    colours = [[0, 255, 0],[0, 0, 255],[255, 0, 0],[0, 255, 255],[255, 255, 0],[255, 0, 255],[80, 70, 180],[250, 80, 190],[245, 145, 50],[70, 150, 250],[50, 190, 190]]
    r = np.zeros_like(image).astype(np.uint8)
    g = np.zeros_like(image).astype(np.uint8)
    b = np.zeros_like(image).astype(np.uint8)
    r[image == 1], g[image == 1], b[image == 1] = colours[random.randrange(0,10)]
    coloured_mask = np.stack([r, g, b], axis=2)
    return coloured_mask

def get_prediction(img_path, threshold):
    img = Image.open(img_path)
    transform = T.Compose([T.ToTensor()])
    img = transform(img)
    pred = model([img.to(device)])
    pred_score = list(pred[0]['scores'].detach().cpu().numpy())
    # scores come back sorted in descending order; keep every prediction
    # up to the last one that clears the threshold (using enumerate avoids
    # the index() bug with duplicate scores, and an empty list no longer crashes)
    keep = [i for i, score in enumerate(pred_score) if score > threshold]
    if not keep:
        return [], [], []
    pred_t = keep[-1]
    masks = (pred[0]['masks'] > 0.5).squeeze().detach().cpu().numpy()
    pred_class = list(pred[0]['labels'].cpu().numpy())
    pred_boxes = [[(b[0], b[1]), (b[2], b[3])] for b in list(pred[0]['boxes'].detach().cpu().numpy())]
    masks = masks[:pred_t+1]
    pred_boxes = pred_boxes[:pred_t+1]
    pred_class = pred_class[:pred_t+1]
    return masks, pred_boxes, pred_class
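
The truncation logic in `get_prediction` relies on torchvision detection models returning predictions sorted by descending score. A standalone sketch of that step (toy scores, hypothetical helper name):

```python
def keep_above(scores, threshold):
    """Count the leading predictions whose score clears threshold.

    Assumes scores are sorted in descending order, as returned by
    torchvision detection models.
    """
    keep = [i for i, s in enumerate(scores) if s > threshold]
    return keep[-1] + 1 if keep else 0

scores = [0.98, 0.91, 0.64, 0.31, 0.05]
print(keep_above(scores, 0.5))   # 3 -> keep the first three predictions
print(keep_above(scores, 0.99))  # 0 -> nothing clears the threshold
```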
In [20]:
model = get_model_instance_segmentation(num_classes)
model.to(device)
model.eval()
plot_mask_rcnn_result('egohands/DATA_IMAGES/Image9_26.jpg', threshold=0.7)
In [21]:
plot_mask_rcnn_result('egohands/DATA_IMAGES/Image10_26.jpg', threshold=0.6)

These are definitely not good. Let's now train the model for 20 epochs.

In [22]:
model = get_model_instance_segmentation(num_classes)
model.to(device)
params = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(params, lr=0.005)
lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer,step_size=3,gamma=0.1)
num_epochs = 20
for epoch in range(num_epochs):
    train_one_epoch(model, optimizer, data_loader, device, epoch, print_freq=10)
    lr_scheduler.step()
Epoch: [0]  [   0/2375]  eta: 0:42:34  loss_objectness: 0.0309 (0.0309)  loss: 4.6981 (4.6981)  loss_rpn_box_reg: 0.0121 (0.0121)  loss_mask: 3.6880 (3.6880)  loss_box_reg: 0.2721 (0.2721)  loss_classifier: 0.6950 (0.6950)  lr: 0.000010  time: 1.0757  data: 0.3580  max mem: 4701
Epoch: [0]  [  10/2375]  eta: 0:26:14  loss_objectness: 0.0611 (0.0613)  loss: 3.3104 (3.6030)  loss_rpn_box_reg: 0.0195 (0.0201)  loss_mask: 2.5056 (2.7322)  loss_box_reg: 0.1786 (0.1699)  loss_classifier: 0.6288 (0.6195)  lr: 0.000060  time: 0.6656  data: 0.0390  max mem: 4978
some issue here. skipping.
Epoch: [1]  [   0/2375]  eta: 0:37:45  loss_objectness: 0.0631 (0.0631)  loss: 1.2528 (1.2528)  loss_rpn_box_reg: 0.0157 (0.0157)  loss_mask: 0.7181 (0.7181)  loss_box_reg: 0.1618 (0.1618)  loss_classifier: 0.2941 (0.2941)  lr: 0.005000  time: 0.9539  data: 0.3145  max mem: 4978
Epoch: [1]  [  10/2375]  eta: 0:25:39  loss_objectness: 0.1000 (0.1337)  loss: 1.2528 (1.4175)  loss_rpn_box_reg: 0.0263 (0.0313)  loss_mask: 0.7181 (0.8527)  loss_box_reg: 0.1396 (0.1278)  loss_classifier: 0.2761 (0.2720)  lr: 0.005000  time: 0.6510  data: 0.0358  max mem: 4978
Epoch: [1]  [  20/2375]  eta: 0:24:56  loss_objectness: 0.0806 (0.1065)  loss: 1.0511 (1.2091)  loss_rpn_box_reg: 0.0260 (0.0285)  loss_mask: 0.6334 (0.7403)  loss_box_reg: 0.0898 (0.1148)  loss_classifier: 0.1577 (0.2190)  lr: 0.005000  time: 0.6193  data: 0.0076  max mem: 4978
some issue here. skipping.
Epoch: [2]  [   0/2375]  eta: 0:36:27  loss_objectness: 0.0692 (0.0692)  loss: 0.8996 (0.8996)  loss_rpn_box_reg: 0.0477 (0.0477)  loss_mask: 0.4800 (0.4800)  loss_box_reg: 0.1407 (0.1407)  loss_classifier: 0.1619 (0.1619)  lr: 0.005000  time: 0.9209  data: 0.2830  max mem: 4978
Epoch: [2]  [  10/2375]  eta: 0:25:57  loss_objectness: 0.0495 (0.0648)  loss: 0.9930 (1.0151)  loss_rpn_box_reg: 0.0274 (0.0388)  loss_mask: 0.4800 (0.4827)  loss_box_reg: 0.2218 (0.2176)  loss_classifier: 0.1762 (0.2111)  lr: 0.005000  time: 0.6587  data: 0.0323  max mem: 4978
Epoch: [2]  [  20/2375]  eta: 0:25:32  loss_objectness: 0.0408 (0.0529)  loss: 0.9233 (0.9711)  loss_rpn_box_reg: 0.0240 (0.0307)  loss_mask: 0.4326 (0.4715)  loss_box_reg: 0.2096 (0.2156)  loss_classifier: 0.1762 (0.2003)  lr: 0.005000  time: 0.6373  data: 0.0073  max mem: 4978
Epoch: [2]  [  30/2375]  eta: 0:25:21  loss_objectness: 0.0349 (0.0561)  loss: 0.9112 (0.9873)  loss_rpn_box_reg: 0.0211 (0.0300)  loss_mask: 0.4282 (0.4744)  loss_box_reg: 0.2253 (0.2193)  loss_classifier: 0.1974 (0.2075)  lr: 0.005000  time: 0.6437  data: 0.0077  max mem: 4978
Epoch: [2]  [  40/2375]  eta: 0:25:11  loss_objectness: 0.0232 (0.0474)  loss: 0.8657 (0.9343)  loss_rpn_box_reg: 0.0211 (0.0272)  loss_mask: 0.4068 (0.4453)  loss_box_reg: 0.2315 (0.2189)  loss_classifier: 0.1871 (0.1954)  lr: 0.005000  time: 0.6439  data: 0.0077  max mem: 4978
Epoch: [2]  [  50/2375]  eta: 0:25:01  loss_objectness: 0.0173 (0.0425)  loss: 0.7988 (0.9107)  loss_rpn_box_reg: 0.0213 (0.0261)  loss_mask: 0.3525 (0.4270)  loss_box_reg: 0.2334 (0.2237)  loss_classifier: 0.1711 (0.1915)  lr: 0.005000  time: 0.6413  data: 0.0074  max mem: 4978
some issue here. skipping.
Epoch: [3]  [   0/2375]  eta: 0:37:27  loss_objectness: 0.0319 (0.0319)  loss: 1.0145 (1.0145)  loss_rpn_box_reg: 0.0635 (0.0635)  loss_mask: 0.3802 (0.3802)  loss_box_reg: 0.2724 (0.2724)  loss_classifier: 0.2664 (0.2664)  lr: 0.000500  time: 0.9462  data: 0.2912  max mem: 4978
Epoch: [3]  [  10/2375]  eta: 0:26:29  loss_objectness: 0.0110 (0.0167)  loss: 0.7972 (0.8150)  loss_rpn_box_reg: 0.0204 (0.0234)  loss_mask: 0.3319 (0.3393)  loss_box_reg: 0.2660 (0.2529)  loss_classifier: 0.1702 (0.1827)  lr: 0.000500  time: 0.6723  data: 0.0337  max mem: 4978
Epoch: [3]  [  20/2375]  eta: 0:25:50  loss_objectness: 0.0111 (0.0168)  loss: 0.6901 (0.7473)  loss_rpn_box_reg: 0.0165 (0.0213)  loss_mask: 0.2862 (0.3075)  loss_box_reg: 0.2266 (0.2355)  loss_classifier: 0.1519 (0.1662)  lr: 0.000500  time: 0.6442  data: 0.0076  max mem: 4978
Epoch: [3]  [  30/2375]  eta: 0:25:35  loss_objectness: 0.0129 (0.0159)  loss: 0.6428 (0.7296)  loss_rpn_box_reg: 0.0177 (0.0210)  loss_mask: 0.2770 (0.3050)  loss_box_reg: 0.2214 (0.2306)  loss_classifier: 0.1361 (0.1570)  lr: 0.000500  time: 0.6451  data: 0.0074  max mem: 4978
some issue here. skipping.
Epoch: [4]  [   0/2375]  eta: 0:38:25  loss_objectness: 0.0183 (0.0183)  loss: 0.9193 (0.9193)  loss_rpn_box_reg: 0.0411 (0.0411)  loss_mask: 0.3745 (0.3745)  loss_box_reg: 0.2687 (0.2687)  loss_classifier: 0.2167 (0.2167)  lr: 0.000500  time: 0.9707  data: 0.3084  max mem: 4978
Epoch: [4]  [  10/2375]  eta: 0:26:22  loss_objectness: 0.0093 (0.0103)  loss: 0.6321 (0.6649)  loss_rpn_box_reg: 0.0190 (0.0204)  loss_mask: 0.2992 (0.3054)  loss_box_reg: 0.1976 (0.1959)  loss_classifier: 0.1421 (0.1328)  lr: 0.000500  time: 0.6692  data: 0.0347  max mem: 4978
Epoch: [4]  [  20/2375]  eta: 0:25:51  loss_objectness: 0.0083 (0.0109)  loss: 0.6202 (0.6533)  loss_rpn_box_reg: 0.0181 (0.0205)  loss_mask: 0.3023 (0.3054)  loss_box_reg: 0.1782 (0.1867)  loss_classifier: 0.1227 (0.1297)  lr: 0.000500  time: 0.6432  data: 0.0073  max mem: 4978
Epoch: [4]  [  30/2375]  eta: 0:25:31  loss_objectness: 0.0076 (0.0124)  loss: 0.6327 (0.6376)  loss_rpn_box_reg: 0.0172 (0.0190)  loss_mask: 0.2999 (0.2976)  loss_box_reg: 0.1806 (0.1798)  loss_classifier: 0.1278 (0.1288)  lr: 0.000500  time: 0.6445  data: 0.0076  max mem: 4978
Epoch: [4]  [  40/2375]  eta: 0:25:19  loss_objectness: 0.0097 (0.0139)  loss: 0.6526 (0.6507)  loss_rpn_box_reg: 0.0160 (0.0195)  loss_mask: 0.2922 (0.3022)  loss_box_reg: 0.1719 (0.1794)  loss_classifier: 0.1425 (0.1358)  lr: 0.000500  time: 0.6427  data: 0.0076  max mem: 4978
Epoch: [4]  [  50/2375]  eta: 0:25:08  loss_objectness: 0.0097 (0.0139)  loss: 0.6231 (0.6412)  loss_rpn_box_reg: 0.0160 (0.0187)  loss_mask: 0.2922 (0.2994)  loss_box_reg: 0.1537 (0.1747)  loss_classifier: 0.1346 (0.1345)  lr: 0.000500  time: 0.6424  data: 0.0073  max mem: 4978
Epoch: [4]  [  60/2375]  eta: 0:25:02  loss_objectness: 0.0085 (0.0167)  loss: 0.6216 (0.6402)  loss_rpn_box_reg: 0.0146 (0.0185)  loss_mask: 0.2880 (0.2997)  loss_box_reg: 0.1420 (0.1711)  loss_classifier: 0.1320 (0.1342)  lr: 0.000500  time: 0.6447  data: 0.0076  max mem: 4978
Epoch: [4]  [  70/2375]  eta: 0:24:52  loss_objectness: 0.0066 (0.0158)  loss: 0.6159 (0.6338)  loss_rpn_box_reg: 0.0156 (0.0181)  loss_mask: 0.2880 (0.3002)  loss_box_reg: 0.1438 (0.1671)  loss_classifier: 0.1256 (0.1325)  lr: 0.000500  time: 0.6441  data: 0.0081  max mem: 4978
Epoch: [4]  [  80/2375]  eta: 0:24:42  loss_objectness: 0.0092 (0.0156)  loss: 0.5838 (0.6310)  loss_rpn_box_reg: 0.0156 (0.0178)  loss_mask: 0.2830 (0.3003)  loss_box_reg: 0.1381 (0.1642)  loss_classifier: 0.1292 (0.1332)  lr: 0.000500  time: 0.6373  data: 0.0077  max mem: 4978
Epoch: [4]  [  90/2375]  eta: 0:24:35  loss_objectness: 0.0092 (0.0150)  loss: 0.5789 (0.6254)  loss_rpn_box_reg: 0.0163 (0.0179)  loss_mask: 0.2871 (0.2981)  loss_box_reg: 0.1381 (0.1615)  loss_classifier: 0.1292 (0.1329)  lr: 0.000500  time: 0.6385  data: 0.0070  max mem: 4978
Epoch: [4]  [ 100/2375]  eta: 0:24:29  loss_objectness: 0.0083 (0.0145)  loss: 0.5673 (0.6225)  loss_rpn_box_reg: 0.0140 (0.0177)  loss_mask: 0.2991 (0.3000)  loss_box_reg: 0.1283 (0.1581)  loss_classifier: 0.1199 (0.1322)  lr: 0.000500  time: 0.6454  data: 0.0070  max mem: 4978
Epoch: [4]  [ 110/2375]  eta: 0:24:22  loss_objectness: 0.0075 (0.0142)  loss: 0.5310 (0.6147)  loss_rpn_box_reg: 0.0124 (0.0175)  loss_mask: 0.2898 (0.2971)  loss_box_reg: 0.1283 (0.1555)  loss_classifier: 0.1117 (0.1304)  lr: 0.000500  time: 0.6454  data: 0.0072  max mem: 4978
Epoch: [4]  [ 120/2375]  eta: 0:24:15  loss_objectness: 0.0077 (0.0147)  loss: 0.5310 (0.6092)  loss_rpn_box_reg: 0.0173 (0.0184)  loss_mask: 0.2786 (0.2946)  loss_box_reg: 0.1246 (0.1524)  loss_classifier: 0.1081 (0.1291)  lr: 0.000500  time: 0.6433  data: 0.0072  max mem: 4978
Epoch: [4]  [ 130/2375]  eta: 0:24:08  loss_objectness: 0.0075 (0.0143)  loss: 0.5454 (0.6063)  loss_rpn_box_reg: 0.0145 (0.0182)  loss_mask: 0.2830 (0.2947)  loss_box_reg: 0.1187 (0.1502)  loss_classifier: 0.1209 (0.1289)  lr: 0.000500  time: 0.6432  data: 0.0072  max mem: 4978
Epoch: [4]  [ 140/2375]  eta: 0:24:03  loss_objectness: 0.0055 (0.0139)  loss: 0.5147 (0.6011)  loss_rpn_box_reg: 0.0133 (0.0181)  loss_mask: 0.2775 (0.2932)  loss_box_reg: 0.1187 (0.1482)  loss_classifier: 0.1209 (0.1277)  lr: 0.000500  time: 0.6479  data: 0.0072  max mem: 4978
Epoch: [4]  [ 150/2375]  eta: 0:23:55  loss_objectness: 0.0055 (0.0138)  loss: 0.4923 (0.5951)  loss_rpn_box_reg: 0.0152 (0.0181)  loss_mask: 0.2635 (0.2913)  loss_box_reg: 0.0990 (0.1451)  loss_classifier: 0.1033 (0.1268)  lr: 0.000500  time: 0.6448  data: 0.0072  max mem: 4978
Epoch: [4]  [ 160/2375]  eta: 0:23:48  loss_objectness: 0.0063 (0.0143)  loss: 0.5444 (0.5948)  loss_rpn_box_reg: 0.0178 (0.0182)  loss_mask: 0.2733 (0.2909)  loss_box_reg: 0.1115 (0.1439)  loss_classifier: 0.1079 (0.1275)  lr: 0.000500  time: 0.6387  data: 0.0073  max mem: 4978
Epoch: [4]  [ 170/2375]  eta: 0:23:41  loss_objectness: 0.0079 (0.0140)  loss: 0.5239 (0.5904)  loss_rpn_box_reg: 0.0181 (0.0181)  loss_mask: 0.2735 (0.2895)  loss_box_reg: 0.1115 (0.1420)  loss_classifier: 0.1286 (0.1268)  lr: 0.000500  time: 0.6390  data: 0.0071  max mem: 4978
Epoch: [4]  [ 180/2375]  eta: 0:23:34  loss_objectness: 0.0075 (0.0138)  loss: 0.4905 (0.5863)  loss_rpn_box_reg: 0.0134 (0.0180)  loss_mask: 0.2590 (0.2883)  loss_box_reg: 0.1010 (0.1401)  loss_classifier: 0.1138 (0.1261)  lr: 0.000500  time: 0.6403  data: 0.0071  max mem: 4978
Epoch: [4]  [ 190/2375]  eta: 0:23:27  loss_objectness: 0.0069 (0.0135)  loss: 0.5078 (0.5838)  loss_rpn_box_reg: 0.0151 (0.0180)  loss_mask: 0.2543 (0.2868)  loss_box_reg: 0.1089 (0.1392)  loss_classifier: 0.1161 (0.1264)  lr: 0.000500  time: 0.6435  data: 0.0074  max mem: 4978
Epoch: [4]  [ 200/2375]  eta: 0:23:20  loss_objectness: 0.0053 (0.0133)  loss: 0.5445 (0.5829)  loss_rpn_box_reg: 0.0166 (0.0181)  loss_mask: 0.2574 (0.2879)  loss_box_reg: 0.1064 (0.1378)  loss_classifier: 0.1183 (0.1257)  lr: 0.000500  time: 0.6405  data: 0.0075  max mem: 4978
Epoch: [4]  [ 210/2375]  eta: 0:23:14  loss_objectness: 0.0070 (0.0134)  loss: 0.5560 (0.5804)  loss_rpn_box_reg: 0.0120 (0.0179)  loss_mask: 0.2911 (0.2881)  loss_box_reg: 0.1055 (0.1362)  loss_classifier: 0.0965 (0.1249)  lr: 0.000500  time: 0.6398  data: 0.0072  max mem: 4978
Epoch: [4]  [ 220/2375]  eta: 0:23:08  loss_objectness: 0.0051 (0.0133)  loss: 0.4768 (0.5775)  loss_rpn_box_reg: 0.0103 (0.0177)  loss_mask: 0.2704 (0.2874)  loss_box_reg: 0.1045 (0.1349)  loss_classifier: 0.0924 (0.1242)  lr: 0.000500  time: 0.6486  data: 0.0073  max mem: 4978
Epoch: [4]  [ 230/2375]  eta: 0:23:01  loss_objectness: 0.0038 (0.0136)  loss: 0.4768 (0.5740)  loss_rpn_box_reg: 0.0105 (0.0176)  loss_mask: 0.2611 (0.2860)  loss_box_reg: 0.0974 (0.1331)  loss_classifier: 0.1034 (0.1237)  lr: 0.000500  time: 0.6460  data: 0.0073  max mem: 4978
Epoch: [4]  [ 240/2375]  eta: 0:22:55  loss_objectness: 0.0041 (0.0141)  loss: 0.5396 (0.5730)  loss_rpn_box_reg: 0.0143 (0.0176)  loss_mask: 0.2622 (0.2853)  loss_box_reg: 0.1019 (0.1324)  loss_classifier: 0.1128 (0.1236)  lr: 0.000500  time: 0.6407  data: 0.0073  max mem: 4978
Epoch: [4]  [ 250/2375]  eta: 0:22:48  loss_objectness: 0.0065 (0.0138)  loss: 0.5236 (0.5707)  loss_rpn_box_reg: 0.0165 (0.0177)  loss_mask: 0.2622 (0.2849)  loss_box_reg: 0.1040 (0.1313)  loss_classifier: 0.1088 (0.1229)  lr: 0.000500  time: 0.6456  data: 0.0082  max mem: 4978
Epoch: [4]  [ 260/2375]  eta: 0:22:42  loss_objectness: 0.0065 (0.0140)  loss: 0.4897 (0.5679)  loss_rpn_box_reg: 0.0162 (0.0179)  loss_mask: 0.2612 (0.2837)  loss_box_reg: 0.1013 (0.1299)  loss_classifier: 0.1014 (0.1224)  lr: 0.000500  time: 0.6474  data: 0.0081  max mem: 4978
Epoch: [4]  [ 270/2375]  eta: 0:22:36  loss_objectness: 0.0052 (0.0142)  loss: 0.4723 (0.5656)  loss_rpn_box_reg: 0.0150 (0.0179)  loss_mask: 0.2580 (0.2830)  loss_box_reg: 0.0880 (0.1287)  loss_classifier: 0.1036 (0.1218)  lr: 0.000500  time: 0.6444  data: 0.0073  max mem: 4978
Epoch: [4]  [ 280/2375]  eta: 0:22:29  loss_objectness: 0.0069 (0.0140)  loss: 0.4495 (0.5617)  loss_rpn_box_reg: 0.0165 (0.0178)  loss_mask: 0.2480 (0.2813)  loss_box_reg: 0.0880 (0.1276)  loss_classifier: 0.0901 (0.1210)  lr: 0.000500  time: 0.6416  data: 0.0073  max mem: 4978
Epoch: [4]  [ 290/2375]  eta: 0:22:23  loss_objectness: 0.0077 (0.0139)  loss: 0.4817 (0.5606)  loss_rpn_box_reg: 0.0131 (0.0178)  loss_mask: 0.2576 (0.2810)  loss_box_reg: 0.1046 (0.1272)  loss_classifier: 0.1016 (0.1207)  lr: 0.000500  time: 0.6433  data: 0.0072  max mem: 4978
Epoch: [4]  [ 300/2375]  eta: 0:22:16  loss_objectness: 0.0082 (0.0137)  loss: 0.4933 (0.5587)  loss_rpn_box_reg: 0.0118 (0.0177)  loss_mask: 0.2701 (0.2805)  loss_box_reg: 0.1080 (0.1265)  loss_classifier: 0.1016 (0.1203)  lr: 0.000500  time: 0.6452  data: 0.0072  max mem: 4978
Epoch: [4]  [ 310/2375]  eta: 0:22:09  loss_objectness: 0.0082 (0.0139)  loss: 0.5089 (0.5589)  loss_rpn_box_reg: 0.0137 (0.0177)  loss_mask: 0.2664 (0.2806)  loss_box_reg: 0.1030 (0.1261)  loss_classifier: 0.1231 (0.1208)  lr: 0.000500  time: 0.6410  data: 0.0072  max mem: 4978
Epoch: [4]  [ 320/2375]  eta: 0:22:03  loss_objectness: 0.0074 (0.0139)  loss: 0.5127 (0.5570)  loss_rpn_box_reg: 0.0127 (0.0175)  loss_mask: 0.2609 (0.2797)  loss_box_reg: 0.1030 (0.1255)  loss_classifier: 0.1231 (0.1204)  lr: 0.000500  time: 0.6384  data: 0.0072  max mem: 4978
Epoch: [4]  [ 330/2375]  eta: 0:21:56  loss_objectness: 0.0057 (0.0141)  loss: 0.4752 (0.5554)  loss_rpn_box_reg: 0.0121 (0.0174)  loss_mask: 0.2528 (0.2793)  loss_box_reg: 0.0937 (0.1245)  loss_classifier: 0.1049 (0.1201)  lr: 0.000500  time: 0.6421  data: 0.0072  max mem: 4978
Epoch: [4]  [ 340/2375]  eta: 0:21:49  loss_objectness: 0.0057 (0.0140)  loss: 0.4783 (0.5553)  loss_rpn_box_reg: 0.0139 (0.0174)  loss_mask: 0.2857 (0.2801)  loss_box_reg: 0.0933 (0.1237)  loss_classifier: 0.1106 (0.1201)  lr: 0.000500  time: 0.6412  data: 0.0072  max mem: 4978
Epoch: [4]  [ 350/2375]  eta: 0:21:44  loss_objectness: 0.0044 (0.0138)  loss: 0.5023 (0.5532)  loss_rpn_box_reg: 0.0123 (0.0172)  loss_mask: 0.2791 (0.2796)  loss_box_reg: 0.0933 (0.1228)  loss_classifier: 0.1127 (0.1197)  lr: 0.000500  time: 0.6457  data: 0.0077  max mem: 4978
Epoch: [4]  [ 360/2375]  eta: 0:21:37  loss_objectness: 0.0059 (0.0138)  loss: 0.5181 (0.5532)  loss_rpn_box_reg: 0.0123 (0.0172)  loss_mask: 0.2755 (0.2796)  loss_box_reg: 0.1036 (0.1225)  loss_classifier: 0.1113 (0.1201)  lr: 0.000500  time: 0.6476  data: 0.0077  max mem: 4978
Epoch: [4]  [ 370/2375]  eta: 0:21:31  loss_objectness: 0.0062 (0.0136)  loss: 0.5257 (0.5516)  loss_rpn_box_reg: 0.0145 (0.0171)  loss_mask: 0.2744 (0.2792)  loss_box_reg: 0.1042 (0.1218)  loss_classifier: 0.1206 (0.1198)  lr: 0.000500  time: 0.6452  data: 0.0072  max mem: 4978
Epoch: [4]  [ 380/2375]  eta: 0:21:24  loss_objectness: 0.0065 (0.0135)  loss: 0.4825 (0.5510)  loss_rpn_box_reg: 0.0142 (0.0170)  loss_mask: 0.2593 (0.2791)  loss_box_reg: 0.0871 (0.1214)  loss_classifier: 0.1206 (0.1199)  lr: 0.000500  time: 0.6412  data: 0.0073  max mem: 4978
Epoch: [4]  [ 390/2375]  eta: 0:21:17  loss_objectness: 0.0079 (0.0137)  loss: 0.4739 (0.5494)  loss_rpn_box_reg: 0.0143 (0.0172)  loss_mask: 0.2593 (0.2788)  loss_box_reg: 0.0840 (0.1204)  loss_classifier: 0.0943 (0.1193)  lr: 0.000500  time: 0.6331  data: 0.0073  max mem: 4978
Epoch: [4]  [ 400/2375]  eta: 0:21:11  loss_objectness: 0.0074 (0.0136)  loss: 0.4765 (0.5481)  loss_rpn_box_reg: 0.0144 (0.0172)  loss_mask: 0.2715 (0.2785)  loss_box_reg: 0.0758 (0.1198)  loss_classifier: 0.0926 (0.1190)  lr: 0.000500  time: 0.6391  data: 0.0076  max mem: 4978
Epoch: [4]  [ 410/2375]  eta: 0:21:04  loss_objectness: 0.0047 (0.0134)  loss: 0.4543 (0.5458)  loss_rpn_box_reg: 0.0135 (0.0172)  loss_mask: 0.2503 (0.2778)  loss_box_reg: 0.0842 (0.1191)  loss_classifier: 0.0917 (0.1183)  lr: 0.000500  time: 0.6450  data: 0.0075  max mem: 4978
Epoch: [4]  [ 420/2375]  eta: 0:20:58  loss_objectness: 0.0047 (0.0134)  loss: 0.4722 (0.5452)  loss_rpn_box_reg: 0.0135 (0.0171)  loss_mask: 0.2503 (0.2778)  loss_box_reg: 0.0842 (0.1185)  loss_classifier: 0.1006 (0.1184)  lr: 0.000500  time: 0.6423  data: 0.0073  max mem: 4978
Epoch: [4]  [ 430/2375]  eta: 0:20:51  loss_objectness: 0.0050 (0.0132)  loss: 0.4781 (0.5445)  loss_rpn_box_reg: 0.0139 (0.0172)  loss_mask: 0.2645 (0.2771)  loss_box_reg: 0.0886 (0.1184)  loss_classifier: 0.1099 (0.1186)  lr: 0.000500  time: 0.6436  data: 0.0073  max mem: 4978
some issue here. skipping.
Epoch: [5]  [   0/2375]  eta: 0:40:00  loss_objectness: 0.0043 (0.0043)  loss: 0.6026 (0.6026)  loss_rpn_box_reg: 0.0237 (0.0237)  loss_mask: 0.3477 (0.3477)  loss_box_reg: 0.1021 (0.1021)  loss_classifier: 0.1248 (0.1248)  lr: 0.000500  time: 1.0108  data: 0.3478  max mem: 4978
Epoch: [5]  [  10/2375]  eta: 0:26:37  loss_objectness: 0.0031 (0.0051)  loss: 0.4157 (0.4251)  loss_rpn_box_reg: 0.0099 (0.0145)  loss_mask: 0.2121 (0.2314)  loss_box_reg: 0.0735 (0.0813)  loss_classifier: 0.0853 (0.0928)  lr: 0.000500  time: 0.6753  data: 0.0382  max mem: 4978
Epoch: [5]  [  20/2375]  eta: 0:26:05  loss_objectness: 0.0030 (0.0046)  loss: 0.4212 (0.4389)  loss_rpn_box_reg: 0.0099 (0.0134)  loss_mask: 0.2395 (0.2417)  loss_box_reg: 0.0735 (0.0840)  loss_classifier: 0.0853 (0.0953)  lr: 0.000500  time: 0.6477  data: 0.0072  max mem: 4978
Epoch: [5]  [  30/2375]  eta: 0:25:51  loss_objectness: 0.0035 (0.0135)  loss: 0.4729 (0.4818)  loss_rpn_box_reg: 0.0109 (0.0149)  loss_mask: 0.2610 (0.2562)  loss_box_reg: 0.1024 (0.0939)  loss_classifier: 0.0997 (0.1033)  lr: 0.000500  time: 0.6539  data: 0.0073  max mem: 4978
Epoch: [5]  [  40/2375]  eta: 0:25:32  loss_objectness: 0.0094 (0.0138)  loss: 0.5250 (0.4928)  loss_rpn_box_reg: 0.0122 (0.0155)  loss_mask: 0.2610 (0.2569)  loss_box_reg: 0.1122 (0.0992)  loss_classifier: 0.1062 (0.1074)  lr: 0.000500  time: 0.6468  data: 0.0073  max mem: 4978
Epoch: [5]  [  50/2375]  eta: 0:25:19  loss_objectness: 0.0047 (0.0119)  loss: 0.4181 (0.4767)  loss_rpn_box_reg: 0.0112 (0.0145)  loss_mask: 0.2324 (0.2548)  loss_box_reg: 0.0795 (0.0934)  loss_classifier: 0.0943 (0.1020)  lr: 0.000500  time: 0.6414  data: 0.0073  max mem: 4978
Epoch: [5]  [  60/2375]  eta: 0:25:13  loss_objectness: 0.0044 (0.0112)  loss: 0.4353 (0.4810)  loss_rpn_box_reg: 0.0108 (0.0146)  loss_mask: 0.2523 (0.2558)  loss_box_reg: 0.0795 (0.0952)  loss_classifier: 0.0943 (0.1043)  lr: 0.000500  time: 0.6495  data: 0.0073  max mem: 4978
Epoch: [5]  [  70/2375]  eta: 0:25:06  loss_objectness: 0.0064 (0.0124)  loss: 0.4972 (0.4925)  loss_rpn_box_reg: 0.0120 (0.0150)  loss_mask: 0.2606 (0.2575)  loss_box_reg: 0.1054 (0.0983)  loss_classifier: 0.1134 (0.1092)  lr: 0.000500  time: 0.6530  data: 0.0073  max mem: 4978
Epoch: [5]  [  80/2375]  eta: 0:24:57  loss_objectness: 0.0064 (0.0118)  loss: 0.5145 (0.4905)  loss_rpn_box_reg: 0.0120 (0.0151)  loss_mask: 0.2560 (0.2570)  loss_box_reg: 0.1008 (0.0971)  loss_classifier: 0.1155 (0.1095)  lr: 0.000500  time: 0.6481  data: 0.0072  max mem: 4978
Epoch: [5]  [  90/2375]  eta: 0:24:47  loss_objectness: 0.0064 (0.0112)  loss: 0.4691 (0.4863)  loss_rpn_box_reg: 0.0101 (0.0144)  loss_mask: 0.2628 (0.2596)  loss_box_reg: 0.0673 (0.0944)  loss_classifier: 0.0887 (0.1067)  lr: 0.000500  time: 0.6431  data: 0.0073  max mem: 4978
Epoch: [5]  [ 100/2375]  eta: 0:24:42  loss_objectness: 0.0048 (0.0108)  loss: 0.4653 (0.4860)  loss_rpn_box_reg: 0.0102 (0.0145)  loss_mask: 0.2674 (0.2594)  loss_box_reg: 0.0724 (0.0946)  loss_classifier: 0.0884 (0.1067)  lr: 0.000500  time: 0.6488  data: 0.0073  max mem: 4978
Epoch: [5]  [ 110/2375]  eta: 0:24:35  loss_objectness: 0.0060 (0.0119)  loss: 0.4783 (0.4882)  loss_rpn_box_reg: 0.0121 (0.0144)  loss_mask: 0.2536 (0.2597)  loss_box_reg: 0.0879 (0.0951)  loss_classifier: 0.1044 (0.1072)  lr: 0.000500  time: 0.6536  data: 0.0075  max mem: 4978
Epoch: [5]  [ 120/2375]  eta: 0:24:28  loss_objectness: 0.0071 (0.0116)  loss: 0.4563 (0.4863)  loss_rpn_box_reg: 0.0103 (0.0142)  loss_mask: 0.2510 (0.2582)  loss_box_reg: 0.0777 (0.0945)  loss_classifier: 0.1014 (0.1078)  lr: 0.000500  time: 0.6490  data: 0.0074  max mem: 4978
Epoch: [5]  [ 130/2375]  eta: 0:24:21  loss_objectness: 0.0056 (0.0116)  loss: 0.4399 (0.4845)  loss_rpn_box_reg: 0.0114 (0.0143)  loss_mask: 0.2284 (0.2579)  loss_box_reg: 0.0765 (0.0938)  loss_classifier: 0.0988 (0.1069)  lr: 0.000500  time: 0.6468  data: 0.0071  max mem: 4978
Epoch: [5]  [ 140/2375]  eta: 0:24:16  loss_objectness: 0.0056 (0.0112)  loss: 0.4126 (0.4811)  loss_rpn_box_reg: 0.0128 (0.0142)  loss_mask: 0.2191 (0.2561)  loss_box_reg: 0.0749 (0.0931)  loss_classifier: 0.0963 (0.1066)  lr: 0.000500  time: 0.6534  data: 0.0072  max mem: 4978
Epoch: [5]  [ 150/2375]  eta: 0:24:08  loss_objectness: 0.0061 (0.0119)  loss: 0.4126 (0.4819)  loss_rpn_box_reg: 0.0128 (0.0144)  loss_mask: 0.2365 (0.2562)  loss_box_reg: 0.0749 (0.0925)  loss_classifier: 0.0963 (0.1068)  lr: 0.000500  time: 0.6518  data: 0.0078  max mem: 4978
Epoch: [5]  [ 160/2375]  eta: 0:24:01  loss_objectness: 0.0058 (0.0117)  loss: 0.4551 (0.4803)  loss_rpn_box_reg: 0.0170 (0.0150)  loss_mask: 0.2412 (0.2551)  loss_box_reg: 0.0824 (0.0923)  loss_classifier: 0.0872 (0.1063)  lr: 0.000500  time: 0.6447  data: 0.0080  max mem: 4978
Epoch: [5]  [ 170/2375]  eta: 0:23:54  loss_objectness: 0.0049 (0.0114)  loss: 0.4157 (0.4787)  loss_rpn_box_reg: 0.0150 (0.0149)  loss_mask: 0.2348 (0.2545)  loss_box_reg: 0.0827 (0.0921)  loss_classifier: 0.0796 (0.1058)  lr: 0.000500  time: 0.6465  data: 0.0074  max mem: 4978
Epoch: [5]  [ 180/2375]  eta: 0:23:48  loss_objectness: 0.0045 (0.0110)  loss: 0.4059 (0.4744)  loss_rpn_box_reg: 0.0128 (0.0148)  loss_mask: 0.2206 (0.2528)  loss_box_reg: 0.0765 (0.0910)  loss_classifier: 0.0758 (0.1048)  lr: 0.000500  time: 0.6501  data: 0.0073  max mem: 4978
Epoch: [5]  [ 190/2375]  eta: 0:23:41  loss_objectness: 0.0042 (0.0110)  loss: 0.4165 (0.4739)  loss_rpn_box_reg: 0.0117 (0.0148)  loss_mask: 0.2388 (0.2526)  loss_box_reg: 0.0727 (0.0907)  loss_classifier: 0.0882 (0.1048)  lr: 0.000500  time: 0.6516  data: 0.0074  max mem: 4978
some issue here. skipping.
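The `some issue here. skipping.` lines (here and after each of the following epochs) look like the output of a broad `try/except` wrapped around the per-epoch training call, so that a batch that raises (for example, an image whose mask yields a degenerate box) aborts the current epoch instead of killing the whole run — note each affected epoch stops partway through its 2375 iterations and the next one starts at iteration 0. The wrapper below is a guess at that pattern, with a stub standing in for the real `train_one_epoch(model, optimizer, data_loader, ...)` call; only `train_one_epoch` itself appears in this notebook.

```python
def run_epochs(num_epochs, train_fn):
    """Run train_fn once per epoch; if an epoch raises, record the marker
    seen in the log above and move on to the next epoch.

    This wrapper is an assumption about the notebook's training loop, not
    code shown in it."""
    messages = []
    for epoch in range(num_epochs):
        try:
            train_fn(epoch)
        except Exception:
            messages.append('some issue here. skipping.')
    return messages

# Toy demonstration: epoch 1 fails, but all three epochs are still attempted.
def flaky_epoch(epoch):
    if epoch == 1:
        raise RuntimeError('bad batch')

print(run_epochs(3, flaky_epoch))
```

The important property is that the exception is caught per epoch, so one bad sample costs at most the remainder of that epoch's iterations.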
Epoch: [6]  [   0/2375]  eta: 0:37:00  loss_objectness: 0.0022 (0.0022)  loss: 0.3003 (0.3003)  loss_rpn_box_reg: 0.0106 (0.0106)  loss_mask: 0.1865 (0.1865)  loss_box_reg: 0.0420 (0.0420)  loss_classifier: 0.0590 (0.0590)  lr: 0.000050  time: 0.9350  data: 0.2739  max mem: 4978
Epoch: [6]  [  10/2375]  eta: 0:26:15  loss_objectness: 0.0054 (0.0188)  loss: 0.4820 (0.4814)  loss_rpn_box_reg: 0.0106 (0.0132)  loss_mask: 0.2467 (0.2700)  loss_box_reg: 0.0737 (0.0840)  loss_classifier: 0.1022 (0.0953)  lr: 0.000050  time: 0.6660  data: 0.0319  max mem: 4978
Epoch: [6]  [  20/2375]  eta: 0:25:35  loss_objectness: 0.0054 (0.0151)  loss: 0.4636 (0.4636)  loss_rpn_box_reg: 0.0102 (0.0139)  loss_mask: 0.2370 (0.2537)  loss_box_reg: 0.0761 (0.0815)  loss_classifier: 0.0942 (0.0995)  lr: 0.000050  time: 0.6379  data: 0.0074  max mem: 4978
Epoch: [6]  [  30/2375]  eta: 0:25:22  loss_objectness: 0.0050 (0.0116)  loss: 0.4415 (0.4510)  loss_rpn_box_reg: 0.0108 (0.0133)  loss_mask: 0.2193 (0.2452)  loss_box_reg: 0.0745 (0.0810)  loss_classifier: 0.0942 (0.0999)  lr: 0.000050  time: 0.6403  data: 0.0072  max mem: 4978
Epoch: [6]  [  40/2375]  eta: 0:25:14  loss_objectness: 0.0044 (0.0126)  loss: 0.4148 (0.4463)  loss_rpn_box_reg: 0.0108 (0.0132)  loss_mask: 0.2203 (0.2470)  loss_box_reg: 0.0602 (0.0773)  loss_classifier: 0.0853 (0.0963)  lr: 0.000050  time: 0.6449  data: 0.0071  max mem: 4978
Epoch: [6]  [  50/2375]  eta: 0:25:07  loss_objectness: 0.0047 (0.0116)  loss: 0.4148 (0.4448)  loss_rpn_box_reg: 0.0101 (0.0132)  loss_mask: 0.2345 (0.2458)  loss_box_reg: 0.0737 (0.0776)  loss_classifier: 0.0867 (0.0966)  lr: 0.000050  time: 0.6463  data: 0.0072  max mem: 4978
Epoch: [6]  [  60/2375]  eta: 0:25:00  loss_objectness: 0.0051 (0.0111)  loss: 0.4455 (0.4467)  loss_rpn_box_reg: 0.0110 (0.0131)  loss_mask: 0.2542 (0.2481)  loss_box_reg: 0.0840 (0.0788)  loss_classifier: 0.0889 (0.0956)  lr: 0.000050  time: 0.6472  data: 0.0073  max mem: 4978
Epoch: [6]  [  70/2375]  eta: 0:24:51  loss_objectness: 0.0054 (0.0102)  loss: 0.4109 (0.4454)  loss_rpn_box_reg: 0.0106 (0.0130)  loss_mask: 0.2286 (0.2470)  loss_box_reg: 0.0828 (0.0792)  loss_classifier: 0.0858 (0.0959)  lr: 0.000050  time: 0.6441  data: 0.0073  max mem: 4978
Epoch: [6]  [  80/2375]  eta: 0:24:45  loss_objectness: 0.0047 (0.0111)  loss: 0.4109 (0.4504)  loss_rpn_box_reg: 0.0106 (0.0134)  loss_mask: 0.2286 (0.2476)  loss_box_reg: 0.0748 (0.0800)  loss_classifier: 0.0969 (0.0983)  lr: 0.000050  time: 0.6448  data: 0.0076  max mem: 4978
Epoch: [6]  [  90/2375]  eta: 0:24:39  loss_objectness: 0.0045 (0.0109)  loss: 0.4423 (0.4511)  loss_rpn_box_reg: 0.0126 (0.0133)  loss_mask: 0.2436 (0.2474)  loss_box_reg: 0.0845 (0.0809)  loss_classifier: 0.0983 (0.0985)  lr: 0.000050  time: 0.6499  data: 0.0080  max mem: 4978
Epoch: [6]  [ 100/2375]  eta: 0:24:33  loss_objectness: 0.0045 (0.0108)  loss: 0.4350 (0.4543)  loss_rpn_box_reg: 0.0125 (0.0133)  loss_mask: 0.2462 (0.2498)  loss_box_reg: 0.0798 (0.0812)  loss_classifier: 0.0975 (0.0993)  lr: 0.000050  time: 0.6497  data: 0.0080  max mem: 4978
Epoch: [6]  [ 110/2375]  eta: 0:24:29  loss_objectness: 0.0056 (0.0109)  loss: 0.3965 (0.4539)  loss_rpn_box_reg: 0.0125 (0.0134)  loss_mask: 0.2359 (0.2487)  loss_box_reg: 0.0759 (0.0813)  loss_classifier: 0.0886 (0.0997)  lr: 0.000050  time: 0.6532  data: 0.0076  max mem: 4978
Epoch: [6]  [ 120/2375]  eta: 0:24:22  loss_objectness: 0.0056 (0.0104)  loss: 0.3924 (0.4541)  loss_rpn_box_reg: 0.0130 (0.0133)  loss_mask: 0.2294 (0.2486)  loss_box_reg: 0.0725 (0.0816)  loss_classifier: 0.0886 (0.1001)  lr: 0.000050  time: 0.6530  data: 0.0072  max mem: 4978
Epoch: [6]  [ 130/2375]  eta: 0:24:14  loss_objectness: 0.0046 (0.0103)  loss: 0.4193 (0.4544)  loss_rpn_box_reg: 0.0147 (0.0136)  loss_mask: 0.2542 (0.2505)  loss_box_reg: 0.0704 (0.0810)  loss_classifier: 0.0907 (0.0991)  lr: 0.000050  time: 0.6443  data: 0.0071  max mem: 4978
Epoch: [6]  [ 140/2375]  eta: 0:24:08  loss_objectness: 0.0045 (0.0099)  loss: 0.4067 (0.4522)  loss_rpn_box_reg: 0.0133 (0.0135)  loss_mask: 0.2484 (0.2495)  loss_box_reg: 0.0757 (0.0809)  loss_classifier: 0.0884 (0.0984)  lr: 0.000050  time: 0.6439  data: 0.0071  max mem: 4978
Epoch: [6]  [ 150/2375]  eta: 0:24:00  loss_objectness: 0.0045 (0.0101)  loss: 0.4183 (0.4534)  loss_rpn_box_reg: 0.0130 (0.0137)  loss_mask: 0.2484 (0.2502)  loss_box_reg: 0.0749 (0.0810)  loss_classifier: 0.0899 (0.0985)  lr: 0.000050  time: 0.6447  data: 0.0075  max mem: 4978
Epoch: [6]  [ 160/2375]  eta: 0:23:54  loss_objectness: 0.0034 (0.0098)  loss: 0.4509 (0.4530)  loss_rpn_box_reg: 0.0130 (0.0138)  loss_mask: 0.2362 (0.2490)  loss_box_reg: 0.0749 (0.0812)  loss_classifier: 0.1059 (0.0991)  lr: 0.000050  time: 0.6466  data: 0.0076  max mem: 4978
Epoch: [6]  [ 170/2375]  eta: 0:23:47  loss_objectness: 0.0032 (0.0096)  loss: 0.4583 (0.4561)  loss_rpn_box_reg: 0.0122 (0.0137)  loss_mask: 0.2309 (0.2500)  loss_box_reg: 0.0919 (0.0826)  loss_classifier: 0.1077 (0.1000)  lr: 0.000050  time: 0.6474  data: 0.0073  max mem: 4978
Epoch: [6]  [ 180/2375]  eta: 0:23:41  loss_objectness: 0.0031 (0.0094)  loss: 0.4477 (0.4537)  loss_rpn_box_reg: 0.0137 (0.0139)  loss_mask: 0.2407 (0.2489)  loss_box_reg: 0.0868 (0.0822)  loss_classifier: 0.0910 (0.0993)  lr: 0.000050  time: 0.6451  data: 0.0077  max mem: 4978
Epoch: [6]  [ 190/2375]  eta: 0:23:33  loss_objectness: 0.0068 (0.0102)  loss: 0.4520 (0.4584)  loss_rpn_box_reg: 0.0146 (0.0141)  loss_mask: 0.2417 (0.2501)  loss_box_reg: 0.0778 (0.0830)  loss_classifier: 0.1081 (0.1010)  lr: 0.000050  time: 0.6419  data: 0.0076  max mem: 4978
Epoch: [6]  [ 200/2375]  eta: 0:23:26  loss_objectness: 0.0052 (0.0100)  loss: 0.4520 (0.4570)  loss_rpn_box_reg: 0.0131 (0.0140)  loss_mask: 0.2672 (0.2499)  loss_box_reg: 0.0775 (0.0824)  loss_classifier: 0.0868 (0.1006)  lr: 0.000050  time: 0.6403  data: 0.0073  max mem: 4978
Epoch: [6]  [ 210/2375]  eta: 0:23:20  loss_objectness: 0.0046 (0.0099)  loss: 0.4494 (0.4593)  loss_rpn_box_reg: 0.0111 (0.0139)  loss_mask: 0.2526 (0.2508)  loss_box_reg: 0.0828 (0.0834)  loss_classifier: 0.1006 (0.1013)  lr: 0.000050  time: 0.6455  data: 0.0073  max mem: 4978
Epoch: [6]  [ 220/2375]  eta: 0:23:13  loss_objectness: 0.0061 (0.0101)  loss: 0.4717 (0.4577)  loss_rpn_box_reg: 0.0111 (0.0139)  loss_mask: 0.2373 (0.2496)  loss_box_reg: 0.0964 (0.0833)  loss_classifier: 0.1026 (0.1007)  lr: 0.000050  time: 0.6473  data: 0.0073  max mem: 4978
Epoch: [6]  [ 230/2375]  eta: 0:23:07  loss_objectness: 0.0039 (0.0099)  loss: 0.4212 (0.4566)  loss_rpn_box_reg: 0.0105 (0.0138)  loss_mask: 0.2267 (0.2493)  loss_box_reg: 0.0761 (0.0831)  loss_classifier: 0.0809 (0.1005)  lr: 0.000050  time: 0.6450  data: 0.0072  max mem: 4978
Epoch: [6]  [ 240/2375]  eta: 0:23:00  loss_objectness: 0.0022 (0.0097)  loss: 0.4212 (0.4546)  loss_rpn_box_reg: 0.0093 (0.0137)  loss_mask: 0.2316 (0.2493)  loss_box_reg: 0.0556 (0.0822)  loss_classifier: 0.0779 (0.0997)  lr: 0.000050  time: 0.6464  data: 0.0071  max mem: 4978
Epoch: [6]  [ 250/2375]  eta: 0:22:54  loss_objectness: 0.0032 (0.0097)  loss: 0.3744 (0.4521)  loss_rpn_box_reg: 0.0086 (0.0136)  loss_mask: 0.2265 (0.2486)  loss_box_reg: 0.0536 (0.0815)  loss_classifier: 0.0779 (0.0988)  lr: 0.000050  time: 0.6511  data: 0.0076  max mem: 4978
Epoch: [6]  [ 260/2375]  eta: 0:22:48  loss_objectness: 0.0032 (0.0095)  loss: 0.4198 (0.4520)  loss_rpn_box_reg: 0.0095 (0.0136)  loss_mask: 0.2255 (0.2485)  loss_box_reg: 0.0720 (0.0817)  loss_classifier: 0.0789 (0.0986)  lr: 0.000050  time: 0.6495  data: 0.0078  max mem: 4978
Epoch: [6]  [ 270/2375]  eta: 0:22:42  loss_objectness: 0.0029 (0.0094)  loss: 0.4398 (0.4525)  loss_rpn_box_reg: 0.0110 (0.0136)  loss_mask: 0.2410 (0.2488)  loss_box_reg: 0.0749 (0.0818)  loss_classifier: 0.0863 (0.0990)  lr: 0.000050  time: 0.6500  data: 0.0074  max mem: 4978
Epoch: [6]  [ 280/2375]  eta: 0:22:35  loss_objectness: 0.0032 (0.0092)  loss: 0.4528 (0.4518)  loss_rpn_box_reg: 0.0096 (0.0135)  loss_mask: 0.2473 (0.2488)  loss_box_reg: 0.0728 (0.0816)  loss_classifier: 0.0863 (0.0987)  lr: 0.000050  time: 0.6472  data: 0.0073  max mem: 4978
Epoch: [6]  [ 290/2375]  eta: 0:22:28  loss_objectness: 0.0033 (0.0092)  loss: 0.4419 (0.4512)  loss_rpn_box_reg: 0.0097 (0.0135)  loss_mask: 0.2350 (0.2484)  loss_box_reg: 0.0746 (0.0813)  loss_classifier: 0.0891 (0.0987)  lr: 0.000050  time: 0.6412  data: 0.0072  max mem: 4978
Epoch: [6]  [ 300/2375]  eta: 0:22:21  loss_objectness: 0.0041 (0.0092)  loss: 0.4009 (0.4491)  loss_rpn_box_reg: 0.0118 (0.0136)  loss_mask: 0.2238 (0.2472)  loss_box_reg: 0.0598 (0.0809)  loss_classifier: 0.0853 (0.0982)  lr: 0.000050  time: 0.6408  data: 0.0072  max mem: 4978
Epoch: [6]  [ 310/2375]  eta: 0:22:15  loss_objectness: 0.0037 (0.0091)  loss: 0.4146 (0.4491)  loss_rpn_box_reg: 0.0118 (0.0135)  loss_mask: 0.2361 (0.2476)  loss_box_reg: 0.0626 (0.0810)  loss_classifier: 0.0826 (0.0979)  lr: 0.000050  time: 0.6456  data: 0.0073  max mem: 4978
Epoch: [6]  [ 320/2375]  eta: 0:22:09  loss_objectness: 0.0037 (0.0089)  loss: 0.4239 (0.4475)  loss_rpn_box_reg: 0.0108 (0.0134)  loss_mask: 0.2361 (0.2468)  loss_box_reg: 0.0711 (0.0808)  loss_classifier: 0.0860 (0.0975)  lr: 0.000050  time: 0.6487  data: 0.0073  max mem: 4978
Epoch: [6]  [ 330/2375]  eta: 0:22:02  loss_objectness: 0.0050 (0.0090)  loss: 0.4204 (0.4480)  loss_rpn_box_reg: 0.0111 (0.0134)  loss_mask: 0.2324 (0.2469)  loss_box_reg: 0.0745 (0.0810)  loss_classifier: 0.0942 (0.0977)  lr: 0.000050  time: 0.6476  data: 0.0073  max mem: 4978
Epoch: [6]  [ 340/2375]  eta: 0:21:56  loss_objectness: 0.0045 (0.0089)  loss: 0.4279 (0.4477)  loss_rpn_box_reg: 0.0111 (0.0134)  loss_mask: 0.2481 (0.2469)  loss_box_reg: 0.0772 (0.0810)  loss_classifier: 0.0990 (0.0976)  lr: 0.000050  time: 0.6477  data: 0.0073  max mem: 4978
some issue here. skipping.
Epoch: [7]  [   0/2375]  eta: 0:34:54  loss_objectness: 0.0052 (0.0052)  loss: 0.3620 (0.3620)  loss_rpn_box_reg: 0.0416 (0.0416)  loss_mask: 0.1942 (0.1942)  loss_box_reg: 0.0613 (0.0613)  loss_classifier: 0.0596 (0.0596)  lr: 0.000050  time: 0.8821  data: 0.2247  max mem: 4978
Epoch: [7]  [  10/2375]  eta: 0:26:20  loss_objectness: 0.0029 (0.0060)  loss: 0.4169 (0.4502)  loss_rpn_box_reg: 0.0154 (0.0155)  loss_mask: 0.2454 (0.2546)  loss_box_reg: 0.0655 (0.0733)  loss_classifier: 0.1007 (0.1007)  lr: 0.000050  time: 0.6683  data: 0.0272  max mem: 4978
Epoch: [7]  [  20/2375]  eta: 0:25:56  loss_objectness: 0.0040 (0.0107)  loss: 0.4169 (0.4551)  loss_rpn_box_reg: 0.0122 (0.0158)  loss_mask: 0.2372 (0.2488)  loss_box_reg: 0.0688 (0.0800)  loss_classifier: 0.0983 (0.0997)  lr: 0.000050  time: 0.6499  data: 0.0075  max mem: 4978
Epoch: [7]  [  30/2375]  eta: 0:25:44  loss_objectness: 0.0042 (0.0099)  loss: 0.4353 (0.4540)  loss_rpn_box_reg: 0.0099 (0.0146)  loss_mask: 0.2400 (0.2514)  loss_box_reg: 0.0767 (0.0793)  loss_classifier: 0.0901 (0.0988)  lr: 0.000050  time: 0.6531  data: 0.0073  max mem: 4978
Epoch: [7]  [  40/2375]  eta: 0:25:28  loss_objectness: 0.0064 (0.0095)  loss: 0.4353 (0.4548)  loss_rpn_box_reg: 0.0099 (0.0135)  loss_mask: 0.2421 (0.2500)  loss_box_reg: 0.0769 (0.0809)  loss_classifier: 0.0911 (0.1009)  lr: 0.000050  time: 0.6480  data: 0.0076  max mem: 4978
Epoch: [7]  [  50/2375]  eta: 0:25:16  loss_objectness: 0.0066 (0.0097)  loss: 0.4513 (0.4533)  loss_rpn_box_reg: 0.0092 (0.0156)  loss_mask: 0.2407 (0.2488)  loss_box_reg: 0.0757 (0.0806)  loss_classifier: 0.0925 (0.0986)  lr: 0.000050  time: 0.6424  data: 0.0076  max mem: 4978
Epoch: [7]  [  60/2375]  eta: 0:25:05  loss_objectness: 0.0050 (0.0091)  loss: 0.4471 (0.4572)  loss_rpn_box_reg: 0.0113 (0.0157)  loss_mask: 0.2439 (0.2510)  loss_box_reg: 0.0770 (0.0820)  loss_classifier: 0.0851 (0.0994)  lr: 0.000050  time: 0.6416  data: 0.0073  max mem: 4978
Epoch: [7]  [  70/2375]  eta: 0:25:01  loss_objectness: 0.0046 (0.0088)  loss: 0.4423 (0.4522)  loss_rpn_box_reg: 0.0120 (0.0154)  loss_mask: 0.2328 (0.2483)  loss_box_reg: 0.0785 (0.0813)  loss_classifier: 0.0833 (0.0984)  lr: 0.000050  time: 0.6492  data: 0.0072  max mem: 4978
Epoch: [7]  [  80/2375]  eta: 0:24:52  loss_objectness: 0.0032 (0.0089)  loss: 0.4330 (0.4519)  loss_rpn_box_reg: 0.0118 (0.0153)  loss_mask: 0.2275 (0.2463)  loss_box_reg: 0.0803 (0.0815)  loss_classifier: 0.1010 (0.0999)  lr: 0.000050  time: 0.6508  data: 0.0072  max mem: 4978
Epoch: [7]  [  90/2375]  eta: 0:24:43  loss_objectness: 0.0030 (0.0094)  loss: 0.4186 (0.4515)  loss_rpn_box_reg: 0.0107 (0.0151)  loss_mask: 0.2277 (0.2460)  loss_box_reg: 0.0777 (0.0815)  loss_classifier: 0.0999 (0.0994)  lr: 0.000050  time: 0.6421  data: 0.0072  max mem: 4978
Epoch: [7]  [ 100/2375]  eta: 0:24:37  loss_objectness: 0.0048 (0.0090)  loss: 0.4205 (0.4519)  loss_rpn_box_reg: 0.0110 (0.0148)  loss_mask: 0.2457 (0.2463)  loss_box_reg: 0.0830 (0.0822)  loss_classifier: 0.0873 (0.0995)  lr: 0.000050  time: 0.6455  data: 0.0072  max mem: 4978
Epoch: [7]  [ 110/2375]  eta: 0:24:31  loss_objectness: 0.0055 (0.0087)  loss: 0.4125 (0.4470)  loss_rpn_box_reg: 0.0100 (0.0146)  loss_mask: 0.2401 (0.2453)  loss_box_reg: 0.0779 (0.0809)  loss_classifier: 0.0774 (0.0976)  lr: 0.000050  time: 0.6518  data: 0.0072  max mem: 4978
Epoch: [7]  [ 120/2375]  eta: 0:24:25  loss_objectness: 0.0055 (0.0085)  loss: 0.3984 (0.4476)  loss_rpn_box_reg: 0.0095 (0.0142)  loss_mask: 0.2314 (0.2463)  loss_box_reg: 0.0700 (0.0810)  loss_classifier: 0.0753 (0.0976)  lr: 0.000050  time: 0.6518  data: 0.0076  max mem: 4978
Epoch: [7]  [ 130/2375]  eta: 0:24:18  loss_objectness: 0.0041 (0.0082)  loss: 0.3977 (0.4451)  loss_rpn_box_reg: 0.0103 (0.0140)  loss_mask: 0.2332 (0.2460)  loss_box_reg: 0.0721 (0.0802)  loss_classifier: 0.0913 (0.0967)  lr: 0.000050  time: 0.6485  data: 0.0076  max mem: 4978
Epoch: [7]  [ 140/2375]  eta: 0:24:13  loss_objectness: 0.0039 (0.0081)  loss: 0.4040 (0.4449)  loss_rpn_box_reg: 0.0130 (0.0141)  loss_mask: 0.2256 (0.2451)  loss_box_reg: 0.0784 (0.0806)  loss_classifier: 0.0928 (0.0970)  lr: 0.000050  time: 0.6534  data: 0.0073  max mem: 4978
Epoch: [7]  [ 150/2375]  eta: 0:24:06  loss_objectness: 0.0054 (0.0083)  loss: 0.4247 (0.4464)  loss_rpn_box_reg: 0.0127 (0.0140)  loss_mask: 0.2286 (0.2461)  loss_box_reg: 0.0804 (0.0807)  loss_classifier: 0.0873 (0.0972)  lr: 0.000050  time: 0.6551  data: 0.0077  max mem: 4978
Epoch: [7]  [ 160/2375]  eta: 0:24:00  loss_objectness: 0.0035 (0.0080)  loss: 0.4235 (0.4449)  loss_rpn_box_reg: 0.0100 (0.0138)  loss_mask: 0.2410 (0.2457)  loss_box_reg: 0.0743 (0.0807)  loss_classifier: 0.0870 (0.0967)  lr: 0.000050  time: 0.6500  data: 0.0076  max mem: 4978
Epoch: [7]  [ 170/2375]  eta: 0:23:53  loss_objectness: 0.0035 (0.0078)  loss: 0.4235 (0.4418)  loss_rpn_box_reg: 0.0081 (0.0136)  loss_mask: 0.2241 (0.2444)  loss_box_reg: 0.0725 (0.0799)  loss_classifier: 0.0937 (0.0960)  lr: 0.000050  time: 0.6473  data: 0.0072  max mem: 4978
Epoch: [7]  [ 180/2375]  eta: 0:23:47  loss_objectness: 0.0038 (0.0079)  loss: 0.4337 (0.4430)  loss_rpn_box_reg: 0.0081 (0.0138)  loss_mask: 0.2316 (0.2443)  loss_box_reg: 0.0728 (0.0801)  loss_classifier: 0.0960 (0.0969)  lr: 0.000050  time: 0.6496  data: 0.0076  max mem: 4978
Epoch: [7]  [ 190/2375]  eta: 0:23:40  loss_objectness: 0.0057 (0.0081)  loss: 0.4736 (0.4467)  loss_rpn_box_reg: 0.0111 (0.0137)  loss_mask: 0.2519 (0.2455)  loss_box_reg: 0.0781 (0.0810)  loss_classifier: 0.1282 (0.0983)  lr: 0.000050  time: 0.6499  data: 0.0080  max mem: 4978
Epoch: [7]  [ 200/2375]  eta: 0:23:33  loss_objectness: 0.0066 (0.0082)  loss: 0.4089 (0.4444)  loss_rpn_box_reg: 0.0108 (0.0137)  loss_mask: 0.2498 (0.2442)  loss_box_reg: 0.0618 (0.0803)  loss_classifier: 0.0899 (0.0981)  lr: 0.000050  time: 0.6463  data: 0.0076  max mem: 4978
Epoch: [7]  [ 210/2375]  eta: 0:23:27  loss_objectness: 0.0036 (0.0080)  loss: 0.4089 (0.4444)  loss_rpn_box_reg: 0.0121 (0.0138)  loss_mask: 0.2086 (0.2440)  loss_box_reg: 0.0674 (0.0805)  loss_classifier: 0.0899 (0.0981)  lr: 0.000050  time: 0.6493  data: 0.0073  max mem: 4978
Epoch: [7]  [ 220/2375]  eta: 0:23:20  loss_objectness: 0.0038 (0.0080)  loss: 0.4723 (0.4459)  loss_rpn_box_reg: 0.0141 (0.0138)  loss_mask: 0.2361 (0.2442)  loss_box_reg: 0.0883 (0.0811)  loss_classifier: 0.1014 (0.0989)  lr: 0.000050  time: 0.6519  data: 0.0076  max mem: 4978
Epoch: [7]  [ 230/2375]  eta: 0:23:13  loss_objectness: 0.0048 (0.0082)  loss: 0.4723 (0.4472)  loss_rpn_box_reg: 0.0128 (0.0139)  loss_mask: 0.2497 (0.2442)  loss_box_reg: 0.0883 (0.0815)  loss_classifier: 0.1008 (0.0995)  lr: 0.000050  time: 0.6498  data: 0.0076  max mem: 4978
Epoch: [7]  [ 240/2375]  eta: 0:23:07  loss_objectness: 0.0043 (0.0081)  loss: 0.4341 (0.4472)  loss_rpn_box_reg: 0.0123 (0.0138)  loss_mask: 0.2273 (0.2438)  loss_box_reg: 0.0825 (0.0818)  loss_classifier: 0.0930 (0.0996)  lr: 0.000050  time: 0.6509  data: 0.0074  max mem: 4978
Epoch: [7]  [ 250/2375]  eta: 0:23:01  loss_objectness: 0.0036 (0.0080)  loss: 0.4412 (0.4482)  loss_rpn_box_reg: 0.0122 (0.0138)  loss_mask: 0.2361 (0.2443)  loss_box_reg: 0.0806 (0.0820)  loss_classifier: 0.1048 (0.1000)  lr: 0.000050  time: 0.6534  data: 0.0078  max mem: 4978
Epoch: [7]  [ 260/2375]  eta: 0:22:55  loss_objectness: 0.0031 (0.0083)  loss: 0.4546 (0.4484)  loss_rpn_box_reg: 0.0116 (0.0138)  loss_mask: 0.2506 (0.2448)  loss_box_reg: 0.0716 (0.0817)  loss_classifier: 0.1009 (0.0998)  lr: 0.000050  time: 0.6535  data: 0.0076  max mem: 4978
Epoch: [7]  [ 270/2375]  eta: 0:22:48  loss_objectness: 0.0026 (0.0083)  loss: 0.4398 (0.4480)  loss_rpn_box_reg: 0.0114 (0.0138)  loss_mask: 0.2506 (0.2442)  loss_box_reg: 0.0716 (0.0818)  loss_classifier: 0.0923 (0.1000)  lr: 0.000050  time: 0.6519  data: 0.0072  max mem: 4978
Epoch: [7]  [ 280/2375]  eta: 0:22:41  loss_objectness: 0.0047 (0.0083)  loss: 0.4722 (0.4506)  loss_rpn_box_reg: 0.0127 (0.0138)  loss_mask: 0.2232 (0.2449)  loss_box_reg: 0.0922 (0.0828)  loss_classifier: 0.1139 (0.1007)  lr: 0.000050  time: 0.6468  data: 0.0073  max mem: 4978
Epoch: [7]  [ 290/2375]  eta: 0:22:34  loss_objectness: 0.0086 (0.0085)  loss: 0.4722 (0.4508)  loss_rpn_box_reg: 0.0114 (0.0139)  loss_mask: 0.2325 (0.2446)  loss_box_reg: 0.0880 (0.0827)  loss_classifier: 0.1139 (0.1011)  lr: 0.000050  time: 0.6438  data: 0.0076  max mem: 4978
Epoch: [7]  [ 300/2375]  eta: 0:22:29  loss_objectness: 0.0040 (0.0083)  loss: 0.4340 (0.4506)  loss_rpn_box_reg: 0.0107 (0.0139)  loss_mask: 0.2402 (0.2447)  loss_box_reg: 0.0693 (0.0829)  loss_classifier: 0.0928 (0.1008)  lr: 0.000050  time: 0.6510  data: 0.0078  max mem: 4978
Epoch: [7]  [ 310/2375]  eta: 0:22:22  loss_objectness: 0.0040 (0.0083)  loss: 0.4500 (0.4517)  loss_rpn_box_reg: 0.0111 (0.0138)  loss_mask: 0.2450 (0.2455)  loss_box_reg: 0.0746 (0.0829)  loss_classifier: 0.0978 (0.1012)  lr: 0.000050  time: 0.6519  data: 0.0075  max mem: 4978
some issue here. skipping.
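As a sanity check on these logs: the `loss` column is the sum of the five component losses. The per-line first values are windowed medians (torchvision's `MetricLogger` prints `median (global_avg)`), so they need not sum exactly, but the smoothed values in parentheses do. Taking the Epoch [7], iteration 310 line above as an example:

```python
# Smoothed (parenthesized) component losses from the Epoch [7] [310/2375] line.
components = {
    'loss_objectness':  0.0083,
    'loss_rpn_box_reg': 0.0138,
    'loss_mask':        0.2455,
    'loss_box_reg':     0.0829,
    'loss_classifier':  0.1012,
}
total = sum(components.values())
print(f'{total:.4f}')  # 0.4517, matching the logged loss of (0.4517)
```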
Epoch: [8]  [   0/2375]  eta: 0:36:53  loss_objectness: 0.0011 (0.0011)  loss: 0.4124 (0.4124)  loss_rpn_box_reg: 0.0129 (0.0129)  loss_mask: 0.2453 (0.2453)  loss_box_reg: 0.0756 (0.0756)  loss_classifier: 0.0776 (0.0776)  lr: 0.000050  time: 0.9321  data: 0.2727  max mem: 4978
Epoch: [8]  [  10/2375]  eta: 0:26:16  loss_objectness: 0.0050 (0.0073)  loss: 0.4008 (0.4364)  loss_rpn_box_reg: 0.0105 (0.0128)  loss_mask: 0.2341 (0.2465)  loss_box_reg: 0.0739 (0.0736)  loss_classifier: 0.0800 (0.0961)  lr: 0.000050  time: 0.6666  data: 0.0317  max mem: 4978
Epoch: [8]  [  20/2375]  eta: 0:25:49  loss_objectness: 0.0067 (0.0077)  loss: 0.4008 (0.4358)  loss_rpn_box_reg: 0.0102 (0.0130)  loss_mask: 0.2292 (0.2446)  loss_box_reg: 0.0739 (0.0752)  loss_classifier: 0.0877 (0.0953)  lr: 0.000050  time: 0.6444  data: 0.0076  max mem: 4978
Epoch: [8]  [  30/2375]  eta: 0:25:47  loss_objectness: 0.0053 (0.0071)  loss: 0.4228 (0.4272)  loss_rpn_box_reg: 0.0108 (0.0122)  loss_mask: 0.2282 (0.2392)  loss_box_reg: 0.0738 (0.0757)  loss_classifier: 0.0855 (0.0930)  lr: 0.000050  time: 0.6560  data: 0.0077  max mem: 4978
Epoch: [8]  [  40/2375]  eta: 0:25:29  loss_objectness: 0.0033 (0.0061)  loss: 0.4074 (0.4240)  loss_rpn_box_reg: 0.0103 (0.0115)  loss_mask: 0.2341 (0.2425)  loss_box_reg: 0.0717 (0.0729)  loss_classifier: 0.0775 (0.0911)  lr: 0.000050  time: 0.6516  data: 0.0075  max mem: 4978
Epoch: [8]  [  50/2375]  eta: 0:25:15  loss_objectness: 0.0021 (0.0074)  loss: 0.4006 (0.4250)  loss_rpn_box_reg: 0.0099 (0.0115)  loss_mask: 0.2277 (0.2410)  loss_box_reg: 0.0671 (0.0732)  loss_classifier: 0.0729 (0.0919)  lr: 0.000050  time: 0.6392  data: 0.0072  max mem: 4978
Epoch: [8]  [  60/2375]  eta: 0:25:04  loss_objectness: 0.0040 (0.0082)  loss: 0.3917 (0.4253)  loss_rpn_box_reg: 0.0091 (0.0121)  loss_mask: 0.2275 (0.2406)  loss_box_reg: 0.0659 (0.0723)  loss_classifier: 0.0771 (0.0921)  lr: 0.000050  time: 0.6390  data: 0.0073  max mem: 4978
Epoch: [8]  [  70/2375]  eta: 0:24:56  loss_objectness: 0.0028 (0.0076)  loss: 0.3896 (0.4257)  loss_rpn_box_reg: 0.0101 (0.0123)  loss_mask: 0.2299 (0.2403)  loss_box_reg: 0.0703 (0.0735)  loss_classifier: 0.0793 (0.0920)  lr: 0.000050  time: 0.6429  data: 0.0074  max mem: 4978
Epoch: [8]  [  80/2375]  eta: 0:24:46  loss_objectness: 0.0026 (0.0077)  loss: 0.3751 (0.4274)  loss_rpn_box_reg: 0.0116 (0.0122)  loss_mask: 0.2356 (0.2405)  loss_box_reg: 0.0723 (0.0737)  loss_classifier: 0.0852 (0.0933)  lr: 0.000050  time: 0.6422  data: 0.0073  max mem: 4978
Epoch: [8]  [  90/2375]  eta: 0:24:38  loss_objectness: 0.0041 (0.0079)  loss: 0.3634 (0.4253)  loss_rpn_box_reg: 0.0101 (0.0120)  loss_mask: 0.2100 (0.2384)  loss_box_reg: 0.0491 (0.0735)  loss_classifier: 0.0736 (0.0936)  lr: 0.000050  time: 0.6398  data: 0.0076  max mem: 4978
Epoch: [8]  [ 100/2375]  eta: 0:24:33  loss_objectness: 0.0036 (0.0077)  loss: 0.3900 (0.4309)  loss_rpn_box_reg: 0.0086 (0.0122)  loss_mask: 0.2483 (0.2420)  loss_box_reg: 0.0518 (0.0741)  loss_classifier: 0.0798 (0.0949)  lr: 0.000050  time: 0.6467  data: 0.0076  max mem: 4978
Epoch: [8]  [ 110/2375]  eta: 0:24:25  loss_objectness: 0.0035 (0.0076)  loss: 0.4054 (0.4288)  loss_rpn_box_reg: 0.0103 (0.0121)  loss_mask: 0.2327 (0.2405)  loss_box_reg: 0.0674 (0.0737)  loss_classifier: 0.0853 (0.0948)  lr: 0.000050  time: 0.6465  data: 0.0074  max mem: 4978
Epoch: [8]  [ 120/2375]  eta: 0:24:18  loss_objectness: 0.0031 (0.0073)  loss: 0.3790 (0.4245)  loss_rpn_box_reg: 0.0103 (0.0127)  loss_mask: 0.2110 (0.2380)  loss_box_reg: 0.0653 (0.0730)  loss_classifier: 0.0805 (0.0935)  lr: 0.000050  time: 0.6424  data: 0.0074  max mem: 4978
Epoch: [8]  [ 130/2375]  eta: 0:24:11  loss_objectness: 0.0034 (0.0070)  loss: 0.3790 (0.4219)  loss_rpn_box_reg: 0.0102 (0.0125)  loss_mask: 0.2110 (0.2371)  loss_box_reg: 0.0541 (0.0727)  loss_classifier: 0.0723 (0.0926)  lr: 0.000050  time: 0.6443  data: 0.0076  max mem: 4978
Epoch: [8]  [ 140/2375]  eta: 0:24:06  loss_objectness: 0.0027 (0.0067)  loss: 0.3510 (0.4158)  loss_rpn_box_reg: 0.0071 (0.0122)  loss_mask: 0.2233 (0.2350)  loss_box_reg: 0.0492 (0.0710)  loss_classifier: 0.0628 (0.0910)  lr: 0.000050  time: 0.6511  data: 0.0077  max mem: 4978
Epoch: [8]  [ 150/2375]  eta: 0:24:01  loss_objectness: 0.0037 (0.0070)  loss: 0.3871 (0.4209)  loss_rpn_box_reg: 0.0115 (0.0126)  loss_mask: 0.2251 (0.2357)  loss_box_reg: 0.0753 (0.0730)  loss_classifier: 0.0855 (0.0926)  lr: 0.000050  time: 0.6568  data: 0.0075  max mem: 4978
some issue here. skipping.
Epoch: [9]  [   0/2375]  eta: 0:38:37  loss_objectness: 0.0080 (0.0080)  loss: 0.4348 (0.4348)  loss_rpn_box_reg: 0.0109 (0.0109)  loss_mask: 0.2511 (0.2511)  loss_box_reg: 0.0575 (0.0575)  loss_classifier: 0.1073 (0.1073)  lr: 0.000005  time: 0.9756  data: 0.3148  max mem: 4978
Epoch: [9]  [  10/2375]  eta: 0:27:03  loss_objectness: 0.0040 (0.0049)  loss: 0.4024 (0.4049)  loss_rpn_box_reg: 0.0109 (0.0110)  loss_mask: 0.2307 (0.2258)  loss_box_reg: 0.0712 (0.0720)  loss_classifier: 0.0986 (0.0912)  lr: 0.000005  time: 0.6866  data: 0.0356  max mem: 4978
Epoch: [9]  [  20/2375]  eta: 0:26:18  loss_objectness: 0.0060 (0.0082)  loss: 0.4307 (0.4417)  loss_rpn_box_reg: 0.0113 (0.0121)  loss_mask: 0.2438 (0.2398)  loss_box_reg: 0.0787 (0.0811)  loss_classifier: 0.1020 (0.1006)  lr: 0.000005  time: 0.6551  data: 0.0077  max mem: 4978
Epoch: [9]  [  30/2375]  eta: 0:25:54  loss_objectness: 0.0057 (0.0116)  loss: 0.4135 (0.4116)  loss_rpn_box_reg: 0.0105 (0.0123)  loss_mask: 0.2290 (0.2287)  loss_box_reg: 0.0753 (0.0700)  loss_classifier: 0.0810 (0.0890)  lr: 0.000005  time: 0.6496  data: 0.0076  max mem: 4978
Epoch: [9]  [  40/2375]  eta: 0:25:40  loss_objectness: 0.0029 (0.0110)  loss: 0.3889 (0.4203)  loss_rpn_box_reg: 0.0104 (0.0123)  loss_mask: 0.2187 (0.2348)  loss_box_reg: 0.0569 (0.0710)  loss_classifier: 0.0728 (0.0912)  lr: 0.000005  time: 0.6490  data: 0.0074  max mem: 4978
some issue here. skipping.
Epoch: [10]  [   0/2375]  eta: 0:38:47  loss_objectness: 0.0043 (0.0043)  loss: 0.4533 (0.4533)  loss_rpn_box_reg: 0.0326 (0.0326)  loss_mask: 0.2447 (0.2447)  loss_box_reg: 0.0853 (0.0853)  loss_classifier: 0.0864 (0.0864)  lr: 0.000005  time: 0.9800  data: 0.2974  max mem: 4978
Epoch: [10]  [  10/2375]  eta: 0:26:37  loss_objectness: 0.0029 (0.0038)  loss: 0.3233 (0.3758)  loss_rpn_box_reg: 0.0129 (0.0132)  loss_mask: 0.2165 (0.2290)  loss_box_reg: 0.0614 (0.0586)  loss_classifier: 0.0585 (0.0712)  lr: 0.000005  time: 0.6756  data: 0.0340  max mem: 4978
Epoch: [10]  [  20/2375]  eta: 0:26:10  loss_objectness: 0.0020 (0.0040)  loss: 0.3687 (0.3973)  loss_rpn_box_reg: 0.0118 (0.0130)  loss_mask: 0.2165 (0.2325)  loss_box_reg: 0.0638 (0.0672)  loss_classifier: 0.0728 (0.0806)  lr: 0.000005  time: 0.6514  data: 0.0075  max mem: 4978
Epoch: [10]  [  30/2375]  eta: 0:25:48  loss_objectness: 0.0025 (0.0062)  loss: 0.4403 (0.4182)  loss_rpn_box_reg: 0.0118 (0.0144)  loss_mask: 0.2328 (0.2363)  loss_box_reg: 0.0756 (0.0740)  loss_classifier: 0.0878 (0.0873)  lr: 0.000005  time: 0.6521  data: 0.0073  max mem: 4978
Epoch: [10]  [  40/2375]  eta: 0:25:38  loss_objectness: 0.0047 (0.0063)  loss: 0.4701 (0.4221)  loss_rpn_box_reg: 0.0101 (0.0141)  loss_mask: 0.2414 (0.2372)  loss_box_reg: 0.0727 (0.0743)  loss_classifier: 0.0967 (0.0902)  lr: 0.000005  time: 0.6501  data: 0.0074  max mem: 4978
Epoch: [10]  [  50/2375]  eta: 0:25:27  loss_objectness: 0.0038 (0.0058)  loss: 0.4255 (0.4307)  loss_rpn_box_reg: 0.0110 (0.0140)  loss_mask: 0.2489 (0.2420)  loss_box_reg: 0.0727 (0.0759)  loss_classifier: 0.0927 (0.0930)  lr: 0.000005  time: 0.6512  data: 0.0072  max mem: 4978
Epoch: [10]  [  60/2375]  eta: 0:25:22  loss_objectness: 0.0043 (0.0069)  loss: 0.4255 (0.4347)  loss_rpn_box_reg: 0.0115 (0.0141)  loss_mask: 0.2362 (0.2414)  loss_box_reg: 0.0716 (0.0774)  loss_classifier: 0.0915 (0.0949)  lr: 0.000005  time: 0.6558  data: 0.0072  max mem: 4978
Epoch: [10]  [  70/2375]  eta: 0:25:12  loss_objectness: 0.0045 (0.0073)  loss: 0.4017 (0.4339)  loss_rpn_box_reg: 0.0115 (0.0139)  loss_mask: 0.2086 (0.2406)  loss_box_reg: 0.0716 (0.0767)  loss_classifier: 0.0870 (0.0955)  lr: 0.000005  time: 0.6550  data: 0.0075  max mem: 4978
Epoch: [10]  [  80/2375]  eta: 0:25:03  loss_objectness: 0.0040 (0.0072)  loss: 0.4104 (0.4379)  loss_rpn_box_reg: 0.0105 (0.0138)  loss_mask: 0.2398 (0.2421)  loss_box_reg: 0.0759 (0.0780)  loss_classifier: 0.0847 (0.0967)  lr: 0.000005  time: 0.6471  data: 0.0074  max mem: 4978
Epoch: [10]  [  90/2375]  eta: 0:24:58  loss_objectness: 0.0040 (0.0072)  loss: 0.4239 (0.4411)  loss_rpn_box_reg: 0.0128 (0.0142)  loss_mask: 0.2438 (0.2431)  loss_box_reg: 0.0768 (0.0791)  loss_classifier: 0.0941 (0.0975)  lr: 0.000005  time: 0.6534  data: 0.0074  max mem: 4978
Epoch: [10]  [ 100/2375]  eta: 0:24:49  loss_objectness: 0.0045 (0.0076)  loss: 0.4199 (0.4393)  loss_rpn_box_reg: 0.0128 (0.0143)  loss_mask: 0.2417 (0.2424)  loss_box_reg: 0.0737 (0.0778)  loss_classifier: 0.0941 (0.0971)  lr: 0.000005  time: 0.6521  data: 0.0074  max mem: 4978
Epoch: [10]  [ 110/2375]  eta: 0:24:40  loss_objectness: 0.0048 (0.0074)  loss: 0.4056 (0.4363)  loss_rpn_box_reg: 0.0106 (0.0142)  loss_mask: 0.2196 (0.2403)  loss_box_reg: 0.0666 (0.0784)  loss_classifier: 0.0756 (0.0960)  lr: 0.000005  time: 0.6451  data: 0.0072  max mem: 4978
Epoch: [10]  [ 120/2375]  eta: 0:24:33  loss_objectness: 0.0037 (0.0075)  loss: 0.4198 (0.4370)  loss_rpn_box_reg: 0.0122 (0.0143)  loss_mask: 0.2214 (0.2400)  loss_box_reg: 0.0714 (0.0788)  loss_classifier: 0.0820 (0.0964)  lr: 0.000005  time: 0.6479  data: 0.0073  max mem: 4978
Epoch: [10]  [ 130/2375]  eta: 0:24:26  loss_objectness: 0.0037 (0.0074)  loss: 0.3836 (0.4338)  loss_rpn_box_reg: 0.0112 (0.0140)  loss_mask: 0.2230 (0.2391)  loss_box_reg: 0.0835 (0.0780)  loss_classifier: 0.0820 (0.0953)  lr: 0.000005  time: 0.6514  data: 0.0072  max mem: 4978
some issue here. skipping.
Epoch: [11]  [   0/2375]  eta: 0:36:15  loss_objectness: 0.0041 (0.0041)  loss: 0.4713 (0.4713)  loss_rpn_box_reg: 0.0388 (0.0388)  loss_mask: 0.2826 (0.2826)  loss_box_reg: 0.0929 (0.0929)  loss_classifier: 0.0529 (0.0529)  lr: 0.000005  time: 0.9158  data: 0.2662  max mem: 4978
Epoch: [11]  [  10/2375]  eta: 0:26:17  loss_objectness: 0.0031 (0.0064)  loss: 0.4713 (0.4480)  loss_rpn_box_reg: 0.0108 (0.0146)  loss_mask: 0.2537 (0.2516)  loss_box_reg: 0.0879 (0.0790)  loss_classifier: 0.0971 (0.0964)  lr: 0.000005  time: 0.6670  data: 0.0307  max mem: 4978
Epoch: [11]  [  20/2375]  eta: 0:25:46  loss_objectness: 0.0047 (0.0068)  loss: 0.4392 (0.4453)  loss_rpn_box_reg: 0.0108 (0.0150)  loss_mask: 0.2438 (0.2466)  loss_box_reg: 0.0827 (0.0812)  loss_classifier: 0.0958 (0.0957)  lr: 0.000005  time: 0.6438  data: 0.0073  max mem: 4978
Epoch: [11]  [  30/2375]  eta: 0:25:42  loss_objectness: 0.0047 (0.0066)  loss: 0.4298 (0.4505)  loss_rpn_box_reg: 0.0128 (0.0163)  loss_mask: 0.2406 (0.2438)  loss_box_reg: 0.0872 (0.0865)  loss_classifier: 0.0936 (0.0974)  lr: 0.000005  time: 0.6524  data: 0.0074  max mem: 4978
Epoch: [11]  [  40/2375]  eta: 0:25:22  loss_objectness: 0.0022 (0.0062)  loss: 0.4179 (0.4408)  loss_rpn_box_reg: 0.0113 (0.0149)  loss_mask: 0.2258 (0.2413)  loss_box_reg: 0.0723 (0.0828)  loss_classifier: 0.0872 (0.0956)  lr: 0.000005  time: 0.6474  data: 0.0072  max mem: 4978
Epoch: [11]  [  50/2375]  eta: 0:25:15  loss_objectness: 0.0029 (0.0064)  loss: 0.4167 (0.4460)  loss_rpn_box_reg: 0.0099 (0.0153)  loss_mask: 0.2258 (0.2419)  loss_box_reg: 0.0714 (0.0838)  loss_classifier: 0.0892 (0.0987)  lr: 0.000005  time: 0.6426  data: 0.0072  max mem: 4978
Epoch: [11]  [  60/2375]  eta: 0:25:05  loss_objectness: 0.0048 (0.0068)  loss: 0.4828 (0.4506)  loss_rpn_box_reg: 0.0108 (0.0151)  loss_mask: 0.2392 (0.2439)  loss_box_reg: 0.0814 (0.0843)  loss_classifier: 0.1075 (0.1006)  lr: 0.000005  time: 0.6467  data: 0.0072  max mem: 4978
Epoch: [11]  [  70/2375]  eta: 0:24:59  loss_objectness: 0.0070 (0.0078)  loss: 0.4895 (0.4575)  loss_rpn_box_reg: 0.0120 (0.0150)  loss_mask: 0.2475 (0.2451)  loss_box_reg: 0.0912 (0.0862)  loss_classifier: 0.1107 (0.1033)  lr: 0.000005  time: 0.6476  data: 0.0073  max mem: 4978
Epoch: [11]  [  80/2375]  eta: 0:24:48  loss_objectness: 0.0066 (0.0075)  loss: 0.4501 (0.4470)  loss_rpn_box_reg: 0.0090 (0.0148)  loss_mask: 0.2449 (0.2423)  loss_box_reg: 0.0749 (0.0825)  loss_classifier: 0.1049 (0.0999)  lr: 0.000005  time: 0.6428  data: 0.0072  max mem: 4978
some issue here. skipping.
Epoch: [12]  [   0/2375]  eta: 0:40:53  loss_objectness: 0.0094 (0.0094)  loss: 0.4699 (0.4699)  loss_rpn_box_reg: 0.0143 (0.0143)  loss_mask: 0.2437 (0.2437)  loss_box_reg: 0.1034 (0.1034)  loss_classifier: 0.0991 (0.0991)  lr: 0.000001  time: 1.0332  data: 0.3703  max mem: 4978
Epoch: [12]  [  10/2375]  eta: 0:26:34  loss_objectness: 0.0094 (0.0115)  loss: 0.4624 (0.4447)  loss_rpn_box_reg: 0.0109 (0.0111)  loss_mask: 0.2437 (0.2529)  loss_box_reg: 0.0785 (0.0739)  loss_classifier: 0.0991 (0.0954)  lr: 0.000001  time: 0.6741  data: 0.0402  max mem: 4978
Epoch: [12]  [  20/2375]  eta: 0:25:54  loss_objectness: 0.0069 (0.0088)  loss: 0.4591 (0.4494)  loss_rpn_box_reg: 0.0101 (0.0118)  loss_mask: 0.2460 (0.2488)  loss_box_reg: 0.0830 (0.0820)  loss_classifier: 0.0991 (0.0980)  lr: 0.000001  time: 0.6414  data: 0.0073  max mem: 4978
Epoch: [12]  [  30/2375]  eta: 0:25:37  loss_objectness: 0.0043 (0.0085)  loss: 0.4488 (0.4348)  loss_rpn_box_reg: 0.0091 (0.0111)  loss_mask: 0.2309 (0.2435)  loss_box_reg: 0.0830 (0.0774)  loss_classifier: 0.0941 (0.0944)  lr: 0.000001  time: 0.6457  data: 0.0074  max mem: 4978
Epoch: [12]  [  40/2375]  eta: 0:25:28  loss_objectness: 0.0027 (0.0078)  loss: 0.4422 (0.4423)  loss_rpn_box_reg: 0.0091 (0.0115)  loss_mask: 0.2275 (0.2479)  loss_box_reg: 0.0738 (0.0792)  loss_classifier: 0.0844 (0.0959)  lr: 0.000001  time: 0.6492  data: 0.0074  max mem: 4978
Epoch: [12]  [  50/2375]  eta: 0:25:23  loss_objectness: 0.0026 (0.0081)  loss: 0.4101 (0.4393)  loss_rpn_box_reg: 0.0116 (0.0118)  loss_mask: 0.2566 (0.2463)  loss_box_reg: 0.0738 (0.0788)  loss_classifier: 0.0835 (0.0943)  lr: 0.000001  time: 0.6545  data: 0.0073  max mem: 4978
Epoch: [12]  [  60/2375]  eta: 0:25:13  loss_objectness: 0.0042 (0.0077)  loss: 0.4039 (0.4402)  loss_rpn_box_reg: 0.0123 (0.0126)  loss_mask: 0.2389 (0.2459)  loss_box_reg: 0.0776 (0.0791)  loss_classifier: 0.0802 (0.0950)  lr: 0.000001  time: 0.6524  data: 0.0076  max mem: 4978
Epoch: [12]  [  70/2375]  eta: 0:25:06  loss_objectness: 0.0049 (0.0074)  loss: 0.4293 (0.4427)  loss_rpn_box_reg: 0.0151 (0.0131)  loss_mask: 0.2377 (0.2460)  loss_box_reg: 0.0799 (0.0807)  loss_classifier: 0.0940 (0.0956)  lr: 0.000001  time: 0.6486  data: 0.0077  max mem: 4978
Epoch: [12]  [  80/2375]  eta: 0:24:56  loss_objectness: 0.0052 (0.0075)  loss: 0.4282 (0.4403)  loss_rpn_box_reg: 0.0124 (0.0129)  loss_mask: 0.2357 (0.2456)  loss_box_reg: 0.0777 (0.0793)  loss_classifier: 0.0986 (0.0950)  lr: 0.000001  time: 0.6458  data: 0.0074  max mem: 4978
Epoch: [12]  [  90/2375]  eta: 0:24:47  loss_objectness: 0.0052 (0.0075)  loss: 0.4245 (0.4429)  loss_rpn_box_reg: 0.0103 (0.0130)  loss_mask: 0.2305 (0.2455)  loss_box_reg: 0.0814 (0.0806)  loss_classifier: 0.0984 (0.0963)  lr: 0.000001  time: 0.6418  data: 0.0074  max mem: 4978
Epoch: [12]  [ 100/2375]  eta: 0:24:39  loss_objectness: 0.0044 (0.0073)  loss: 0.4374 (0.4403)  loss_rpn_box_reg: 0.0107 (0.0129)  loss_mask: 0.2305 (0.2443)  loss_box_reg: 0.0883 (0.0804)  loss_classifier: 0.0949 (0.0954)  lr: 0.000001  time: 0.6439  data: 0.0074  max mem: 4978
some issue here. skipping.
Epoch: [13]  [   0/2375]  eta: 0:37:58  loss_objectness: 0.0089 (0.0089)  loss: 0.4137 (0.4137)  loss_rpn_box_reg: 0.0087 (0.0087)  loss_mask: 0.2148 (0.2148)  loss_box_reg: 0.0662 (0.0662)  loss_classifier: 0.1150 (0.1150)  lr: 0.000001  time: 0.9594  data: 0.3070  max mem: 4978
Epoch: [13]  [  10/2375]  eta: 0:26:42  loss_objectness: 0.0038 (0.0092)  loss: 0.4023 (0.4091)  loss_rpn_box_reg: 0.0088 (0.0201)  loss_mask: 0.2182 (0.2227)  loss_box_reg: 0.0581 (0.0682)  loss_classifier: 0.0890 (0.0888)  lr: 0.000001  time: 0.6775  data: 0.0346  max mem: 4978
Epoch: [13]  [  20/2375]  eta: 0:26:02  loss_objectness: 0.0038 (0.0070)  loss: 0.4023 (0.4159)  loss_rpn_box_reg: 0.0088 (0.0162)  loss_mask: 0.2201 (0.2307)  loss_box_reg: 0.0581 (0.0713)  loss_classifier: 0.0788 (0.0907)  lr: 0.000001  time: 0.6485  data: 0.0074  max mem: 4978
Epoch: [13]  [  30/2375]  eta: 0:25:37  loss_objectness: 0.0040 (0.0070)  loss: 0.3997 (0.4161)  loss_rpn_box_reg: 0.0096 (0.0149)  loss_mask: 0.2237 (0.2337)  loss_box_reg: 0.0580 (0.0692)  loss_classifier: 0.0812 (0.0913)  lr: 0.000001  time: 0.6434  data: 0.0073  max mem: 4978
Epoch: [13]  [  40/2375]  eta: 0:25:19  loss_objectness: 0.0035 (0.0081)  loss: 0.3789 (0.4159)  loss_rpn_box_reg: 0.0100 (0.0143)  loss_mask: 0.2312 (0.2358)  loss_box_reg: 0.0531 (0.0687)  loss_classifier: 0.0770 (0.0889)  lr: 0.000001  time: 0.6380  data: 0.0072  max mem: 4978
Epoch: [13]  [  50/2375]  eta: 0:25:12  loss_objectness: 0.0076 (0.0087)  loss: 0.4214 (0.4269)  loss_rpn_box_reg: 0.0124 (0.0145)  loss_mask: 0.2383 (0.2378)  loss_box_reg: 0.0661 (0.0732)  loss_classifier: 0.0937 (0.0927)  lr: 0.000001  time: 0.6429  data: 0.0073  max mem: 4978
some issue here. skipping.
Epoch: [14]  [   0/2375]  eta: 0:34:26  loss_objectness: 0.0044 (0.0044)  loss: 0.3042 (0.3042)  loss_rpn_box_reg: 0.0068 (0.0068)  loss_mask: 0.1922 (0.1922)  loss_box_reg: 0.0422 (0.0422)  loss_classifier: 0.0587 (0.0587)  lr: 0.000001  time: 0.8700  data: 0.2160  max mem: 4978
Epoch: [14]  [  10/2375]  eta: 0:26:13  loss_objectness: 0.0063 (0.0086)  loss: 0.4569 (0.4572)  loss_rpn_box_reg: 0.0120 (0.0118)  loss_mask: 0.2541 (0.2459)  loss_box_reg: 0.0835 (0.0812)  loss_classifier: 0.1076 (0.1098)  lr: 0.000001  time: 0.6651  data: 0.0263  max mem: 4978
Epoch: [14]  [  20/2375]  eta: 0:25:44  loss_objectness: 0.0052 (0.0082)  loss: 0.4437 (0.4456)  loss_rpn_box_reg: 0.0102 (0.0107)  loss_mask: 0.2349 (0.2372)  loss_box_reg: 0.0815 (0.0821)  loss_classifier: 0.1091 (0.1073)  lr: 0.000001  time: 0.6452  data: 0.0073  max mem: 4978
Epoch: [14]  [  30/2375]  eta: 0:25:35  loss_objectness: 0.0052 (0.0120)  loss: 0.4042 (0.4428)  loss_rpn_box_reg: 0.0084 (0.0119)  loss_mask: 0.2219 (0.2376)  loss_box_reg: 0.0751 (0.0788)  loss_classifier: 0.1054 (0.1026)  lr: 0.000001  time: 0.6493  data: 0.0073  max mem: 4978
Epoch: [14]  [  40/2375]  eta: 0:25:23  loss_objectness: 0.0046 (0.0109)  loss: 0.4047 (0.4465)  loss_rpn_box_reg: 0.0128 (0.0141)  loss_mask: 0.2245 (0.2400)  loss_box_reg: 0.0718 (0.0806)  loss_classifier: 0.0898 (0.1009)  lr: 0.000001  time: 0.6490  data: 0.0073  max mem: 4978
Epoch: [14]  [  50/2375]  eta: 0:25:10  loss_objectness: 0.0072 (0.0130)  loss: 0.4231 (0.4467)  loss_rpn_box_reg: 0.0150 (0.0151)  loss_mask: 0.2226 (0.2393)  loss_box_reg: 0.0738 (0.0783)  loss_classifier: 0.0891 (0.1011)  lr: 0.000001  time: 0.6418  data: 0.0072  max mem: 4978
Epoch: [14]  [  60/2375]  eta: 0:25:02  loss_objectness: 0.0056 (0.0117)  loss: 0.4078 (0.4421)  loss_rpn_box_reg: 0.0134 (0.0153)  loss_mask: 0.2226 (0.2384)  loss_box_reg: 0.0739 (0.0781)  loss_classifier: 0.0814 (0.0986)  lr: 0.000001  time: 0.6417  data: 0.0072  max mem: 4978
Epoch: [14]  [  70/2375]  eta: 0:24:54  loss_objectness: 0.0047 (0.0114)  loss: 0.4182 (0.4404)  loss_rpn_box_reg: 0.0080 (0.0150)  loss_mask: 0.2256 (0.2377)  loss_box_reg: 0.0743 (0.0781)  loss_classifier: 0.0842 (0.0982)  lr: 0.000001  time: 0.6454  data: 0.0072  max mem: 4978
Epoch: [14]  [  80/2375]  eta: 0:24:47  loss_objectness: 0.0036 (0.0107)  loss: 0.4278 (0.4382)  loss_rpn_box_reg: 0.0089 (0.0145)  loss_mask: 0.2346 (0.2383)  loss_box_reg: 0.0697 (0.0773)  loss_classifier: 0.0927 (0.0974)  lr: 0.000001  time: 0.6457  data: 0.0072  max mem: 4978
Epoch: [14]  [  90/2375]  eta: 0:24:40  loss_objectness: 0.0036 (0.0103)  loss: 0.4278 (0.4394)  loss_rpn_box_reg: 0.0115 (0.0144)  loss_mask: 0.2326 (0.2405)  loss_box_reg: 0.0697 (0.0769)  loss_classifier: 0.0927 (0.0973)  lr: 0.000001  time: 0.6451  data: 0.0072  max mem: 4978
Epoch: [14]  [ 100/2375]  eta: 0:24:31  loss_objectness: 0.0040 (0.0100)  loss: 0.3660 (0.4382)  loss_rpn_box_reg: 0.0104 (0.0141)  loss_mask: 0.2218 (0.2397)  loss_box_reg: 0.0724 (0.0775)  loss_classifier: 0.0783 (0.0968)  lr: 0.000001  time: 0.6415  data: 0.0075  max mem: 4978
Epoch: [14]  [ 110/2375]  eta: 0:24:24  loss_objectness: 0.0036 (0.0096)  loss: 0.3685 (0.4355)  loss_rpn_box_reg: 0.0095 (0.0137)  loss_mask: 0.2218 (0.2390)  loss_box_reg: 0.0695 (0.0764)  loss_classifier: 0.0797 (0.0968)  lr: 0.000001  time: 0.6408  data: 0.0075  max mem: 4978
Epoch: [14]  [ 120/2375]  eta: 0:24:17  loss_objectness: 0.0033 (0.0092)  loss: 0.3645 (0.4312)  loss_rpn_box_reg: 0.0087 (0.0134)  loss_mask: 0.2232 (0.2383)  loss_box_reg: 0.0553 (0.0747)  loss_classifier: 0.0715 (0.0956)  lr: 0.000001  time: 0.6437  data: 0.0074  max mem: 4978
Epoch: [14]  [ 130/2375]  eta: 0:24:11  loss_objectness: 0.0027 (0.0104)  loss: 0.3763 (0.4316)  loss_rpn_box_reg: 0.0086 (0.0137)  loss_mask: 0.2355 (0.2378)  loss_box_reg: 0.0544 (0.0740)  loss_classifier: 0.0788 (0.0956)  lr: 0.000001  time: 0.6465  data: 0.0075  max mem: 4978
Epoch: [14]  [ 140/2375]  eta: 0:24:05  loss_objectness: 0.0025 (0.0103)  loss: 0.4063 (0.4339)  loss_rpn_box_reg: 0.0106 (0.0137)  loss_mask: 0.2355 (0.2389)  loss_box_reg: 0.0680 (0.0749)  loss_classifier: 0.0813 (0.0960)  lr: 0.000001  time: 0.6489  data: 0.0076  max mem: 4978
Epoch: [14]  [ 150/2375]  eta: 0:23:58  loss_objectness: 0.0043 (0.0100)  loss: 0.4169 (0.4342)  loss_rpn_box_reg: 0.0113 (0.0138)  loss_mask: 0.2465 (0.2388)  loss_box_reg: 0.0723 (0.0751)  loss_classifier: 0.0916 (0.0964)  lr: 0.000001  time: 0.6463  data: 0.0076  max mem: 4978
Epoch: [14]  [ 160/2375]  eta: 0:23:51  loss_objectness: 0.0045 (0.0098)  loss: 0.4169 (0.4326)  loss_rpn_box_reg: 0.0128 (0.0139)  loss_mask: 0.2193 (0.2377)  loss_box_reg: 0.0723 (0.0752)  loss_classifier: 0.0916 (0.0960)  lr: 0.000001  time: 0.6431  data: 0.0076  max mem: 4978
Epoch: [14]  [ 170/2375]  eta: 0:23:43  loss_objectness: 0.0059 (0.0103)  loss: 0.4319 (0.4365)  loss_rpn_box_reg: 0.0117 (0.0138)  loss_mask: 0.2445 (0.2391)  loss_box_reg: 0.0837 (0.0762)  loss_classifier: 0.0985 (0.0970)  lr: 0.000001  time: 0.6393  data: 0.0075  max mem: 4978
Epoch: [14]  [ 180/2375]  eta: 0:23:37  loss_objectness: 0.0112 (0.0104)  loss: 0.4637 (0.4376)  loss_rpn_box_reg: 0.0101 (0.0139)  loss_mask: 0.2445 (0.2391)  loss_box_reg: 0.0861 (0.0764)  loss_classifier: 0.1090 (0.0978)  lr: 0.000001  time: 0.6422  data: 0.0074  max mem: 4978
some issue here. skipping.
Epoch: [15]  [   0/2375]  eta: 0:37:25  loss_objectness: 0.0014 (0.0014)  loss: 0.3562 (0.3562)  loss_rpn_box_reg: 0.0128 (0.0128)  loss_mask: 0.2339 (0.2339)  loss_box_reg: 0.0515 (0.0515)  loss_classifier: 0.0567 (0.0567)  lr: 0.000000  time: 0.9453  data: 0.3002  max mem: 4978
Epoch: [15]  [  10/2375]  eta: 0:26:21  loss_objectness: 0.0036 (0.0054)  loss: 0.4390 (0.4267)  loss_rpn_box_reg: 0.0128 (0.0119)  loss_mask: 0.2349 (0.2333)  loss_box_reg: 0.0700 (0.0807)  loss_classifier: 0.0879 (0.0954)  lr: 0.000000  time: 0.6688  data: 0.0340  max mem: 4978
Epoch: [15]  [  20/2375]  eta: 0:25:44  loss_objectness: 0.0066 (0.0084)  loss: 0.3975 (0.4483)  loss_rpn_box_reg: 0.0130 (0.0139)  loss_mask: 0.2379 (0.2510)  loss_box_reg: 0.0688 (0.0803)  loss_classifier: 0.0874 (0.0947)  lr: 0.000000  time: 0.6412  data: 0.0074  max mem: 4978
Epoch: [15]  [  30/2375]  eta: 0:25:28  loss_objectness: 0.0054 (0.0073)  loss: 0.3913 (0.4376)  loss_rpn_box_reg: 0.0098 (0.0133)  loss_mask: 0.2379 (0.2434)  loss_box_reg: 0.0680 (0.0787)  loss_classifier: 0.0812 (0.0950)  lr: 0.000000  time: 0.6428  data: 0.0074  max mem: 4978
Epoch: [15]  [  40/2375]  eta: 0:25:16  loss_objectness: 0.0042 (0.0071)  loss: 0.3773 (0.4319)  loss_rpn_box_reg: 0.0089 (0.0133)  loss_mask: 0.2402 (0.2417)  loss_box_reg: 0.0680 (0.0767)  loss_classifier: 0.0812 (0.0930)  lr: 0.000000  time: 0.6431  data: 0.0077  max mem: 4978
Epoch: [15]  [  50/2375]  eta: 0:25:07  loss_objectness: 0.0046 (0.0069)  loss: 0.3964 (0.4262)  loss_rpn_box_reg: 0.0102 (0.0128)  loss_mask: 0.2311 (0.2388)  loss_box_reg: 0.0649 (0.0767)  loss_classifier: 0.0808 (0.0910)  lr: 0.000000  time: 0.6432  data: 0.0076  max mem: 4978
Epoch: [15]  [  60/2375]  eta: 0:25:01  loss_objectness: 0.0050 (0.0086)  loss: 0.4491 (0.4329)  loss_rpn_box_reg: 0.0104 (0.0128)  loss_mask: 0.2396 (0.2405)  loss_box_reg: 0.0833 (0.0780)  loss_classifier: 0.0880 (0.0930)  lr: 0.000000  time: 0.6470  data: 0.0076  max mem: 4978
Epoch: [15]  [  70/2375]  eta: 0:24:55  loss_objectness: 0.0031 (0.0083)  loss: 0.4693 (0.4368)  loss_rpn_box_reg: 0.0115 (0.0126)  loss_mask: 0.2428 (0.2428)  loss_box_reg: 0.0822 (0.0783)  loss_classifier: 0.1022 (0.0948)  lr: 0.000000  time: 0.6488  data: 0.0077  max mem: 4978
Epoch: [15]  [  80/2375]  eta: 0:24:48  loss_objectness: 0.0031 (0.0084)  loss: 0.4419 (0.4397)  loss_rpn_box_reg: 0.0121 (0.0127)  loss_mask: 0.2346 (0.2440)  loss_box_reg: 0.0784 (0.0793)  loss_classifier: 0.1006 (0.0954)  lr: 0.000000  time: 0.6488  data: 0.0078  max mem: 4978
Epoch: [15]  [  90/2375]  eta: 0:24:42  loss_objectness: 0.0036 (0.0083)  loss: 0.4347 (0.4375)  loss_rpn_box_reg: 0.0105 (0.0128)  loss_mask: 0.2364 (0.2428)  loss_box_reg: 0.0741 (0.0785)  loss_classifier: 0.0948 (0.0951)  lr: 0.000000  time: 0.6490  data: 0.0078  max mem: 4978
Epoch: [15]  [ 100/2375]  eta: 0:24:35  loss_objectness: 0.0036 (0.0083)  loss: 0.4076 (0.4332)  loss_rpn_box_reg: 0.0105 (0.0131)  loss_mask: 0.2259 (0.2401)  loss_box_reg: 0.0602 (0.0769)  loss_classifier: 0.0803 (0.0948)  lr: 0.000000  time: 0.6479  data: 0.0074  max mem: 4978
Epoch: [15]  [ 110/2375]  eta: 0:24:31  loss_objectness: 0.0047 (0.0082)  loss: 0.4239 (0.4337)  loss_rpn_box_reg: 0.0109 (0.0134)  loss_mask: 0.2334 (0.2399)  loss_box_reg: 0.0607 (0.0774)  loss_classifier: 0.0912 (0.0948)  lr: 0.000000  time: 0.6538  data: 0.0074  max mem: 4978
Epoch: [15]  [ 120/2375]  eta: 0:24:23  loss_objectness: 0.0069 (0.0084)  loss: 0.4619 (0.4382)  loss_rpn_box_reg: 0.0113 (0.0134)  loss_mask: 0.2506 (0.2421)  loss_box_reg: 0.0862 (0.0783)  loss_classifier: 0.0959 (0.0961)  lr: 0.000000  time: 0.6513  data: 0.0075  max mem: 4978
Epoch: [15]  [ 130/2375]  eta: 0:24:16  loss_objectness: 0.0069 (0.0084)  loss: 0.4619 (0.4381)  loss_rpn_box_reg: 0.0124 (0.0133)  loss_mask: 0.2427 (0.2418)  loss_box_reg: 0.0799 (0.0780)  loss_classifier: 0.1055 (0.0965)  lr: 0.000000  time: 0.6439  data: 0.0076  max mem: 4978
Epoch: [15]  [ 140/2375]  eta: 0:24:08  loss_objectness: 0.0053 (0.0088)  loss: 0.4520 (0.4432)  loss_rpn_box_reg: 0.0131 (0.0138)  loss_mask: 0.2300 (0.2427)  loss_box_reg: 0.0790 (0.0795)  loss_classifier: 0.1008 (0.0984)  lr: 0.000000  time: 0.6413  data: 0.0075  max mem: 4978
Epoch: [15]  [ 150/2375]  eta: 0:24:00  loss_objectness: 0.0065 (0.0087)  loss: 0.4404 (0.4420)  loss_rpn_box_reg: 0.0128 (0.0136)  loss_mask: 0.2440 (0.2428)  loss_box_reg: 0.0701 (0.0787)  loss_classifier: 0.0932 (0.0981)  lr: 0.000000  time: 0.6381  data: 0.0075  max mem: 4978
Epoch: [15]  [ 160/2375]  eta: 0:23:53  loss_objectness: 0.0060 (0.0086)  loss: 0.4167 (0.4415)  loss_rpn_box_reg: 0.0121 (0.0135)  loss_mask: 0.2437 (0.2422)  loss_box_reg: 0.0700 (0.0792)  loss_classifier: 0.0870 (0.0979)  lr: 0.000000  time: 0.6428  data: 0.0075  max mem: 4978
some issue here. skipping.
Epoch: [16]  [   0/2375]  eta: 0:35:20  loss_objectness: 0.0042 (0.0042)  loss: 0.4321 (0.4321)  loss_rpn_box_reg: 0.0170 (0.0170)  loss_mask: 0.2248 (0.2248)  loss_box_reg: 0.0841 (0.0841)  loss_classifier: 0.1020 (0.1020)  lr: 0.000000  time: 0.8929  data: 0.2279  max mem: 4978
Epoch: [16]  [  10/2375]  eta: 0:26:19  loss_objectness: 0.0038 (0.0075)  loss: 0.4321 (0.4465)  loss_rpn_box_reg: 0.0145 (0.0186)  loss_mask: 0.2248 (0.2445)  loss_box_reg: 0.0816 (0.0815)  loss_classifier: 0.0890 (0.0944)  lr: 0.000000  time: 0.6679  data: 0.0275  max mem: 4978
Epoch: [16]  [  20/2375]  eta: 0:25:49  loss_objectness: 0.0054 (0.0068)  loss: 0.4152 (0.4315)  loss_rpn_box_reg: 0.0125 (0.0168)  loss_mask: 0.2247 (0.2376)  loss_box_reg: 0.0809 (0.0765)  loss_classifier: 0.0890 (0.0938)  lr: 0.000000  time: 0.6463  data: 0.0077  max mem: 4978
Epoch: [16]  [  30/2375]  eta: 0:25:27  loss_objectness: 0.0060 (0.0081)  loss: 0.3980 (0.4495)  loss_rpn_box_reg: 0.0107 (0.0151)  loss_mask: 0.2296 (0.2483)  loss_box_reg: 0.0658 (0.0791)  loss_classifier: 0.0985 (0.0989)  lr: 0.000000  time: 0.6423  data: 0.0075  max mem: 4978
Epoch: [16]  [  40/2375]  eta: 0:25:22  loss_objectness: 0.0030 (0.0070)  loss: 0.4123 (0.4460)  loss_rpn_box_reg: 0.0102 (0.0142)  loss_mask: 0.2448 (0.2486)  loss_box_reg: 0.0658 (0.0805)  loss_classifier: 0.0774 (0.0958)  lr: 0.000000  time: 0.6456  data: 0.0071  max mem: 4978
Epoch: [16]  [  50/2375]  eta: 0:25:13  loss_objectness: 0.0024 (0.0069)  loss: 0.4230 (0.4373)  loss_rpn_box_reg: 0.0102 (0.0135)  loss_mask: 0.2448 (0.2460)  loss_box_reg: 0.0716 (0.0778)  loss_classifier: 0.0775 (0.0931)  lr: 0.000000  time: 0.6505  data: 0.0072  max mem: 4978
Epoch: [16]  [  60/2375]  eta: 0:25:02  loss_objectness: 0.0049 (0.0069)  loss: 0.4250 (0.4422)  loss_rpn_box_reg: 0.0116 (0.0139)  loss_mask: 0.2408 (0.2476)  loss_box_reg: 0.0716 (0.0787)  loss_classifier: 0.0872 (0.0951)  lr: 0.000000  time: 0.6425  data: 0.0073  max mem: 4978
Epoch: [16]  [  70/2375]  eta: 0:24:54  loss_objectness: 0.0047 (0.0067)  loss: 0.4241 (0.4459)  loss_rpn_box_reg: 0.0125 (0.0136)  loss_mask: 0.2408 (0.2494)  loss_box_reg: 0.0785 (0.0804)  loss_classifier: 0.0959 (0.0957)  lr: 0.000000  time: 0.6410  data: 0.0073  max mem: 4978
Epoch: [16]  [  80/2375]  eta: 0:24:47  loss_objectness: 0.0026 (0.0067)  loss: 0.4241 (0.4477)  loss_rpn_box_reg: 0.0144 (0.0136)  loss_mask: 0.2456 (0.2500)  loss_box_reg: 0.0855 (0.0811)  loss_classifier: 0.0931 (0.0963)  lr: 0.000000  time: 0.6465  data: 0.0073  max mem: 4978
Epoch: [16]  [  90/2375]  eta: 0:24:40  loss_objectness: 0.0030 (0.0065)  loss: 0.4359 (0.4486)  loss_rpn_box_reg: 0.0140 (0.0136)  loss_mask: 0.2429 (0.2489)  loss_box_reg: 0.0919 (0.0827)  loss_classifier: 0.1029 (0.0968)  lr: 0.000000  time: 0.6472  data: 0.0073  max mem: 4978
Epoch: [16]  [ 100/2375]  eta: 0:24:32  loss_objectness: 0.0044 (0.0069)  loss: 0.4301 (0.4472)  loss_rpn_box_reg: 0.0115 (0.0135)  loss_mask: 0.2220 (0.2473)  loss_box_reg: 0.0771 (0.0821)  loss_classifier: 0.1059 (0.0974)  lr: 0.000000  time: 0.6436  data: 0.0072  max mem: 4978
Epoch: [16]  [ 110/2375]  eta: 0:24:25  loss_objectness: 0.0044 (0.0069)  loss: 0.4202 (0.4446)  loss_rpn_box_reg: 0.0105 (0.0133)  loss_mask: 0.2151 (0.2463)  loss_box_reg: 0.0736 (0.0817)  loss_classifier: 0.0915 (0.0964)  lr: 0.000000  time: 0.6418  data: 0.0072  max mem: 4978
Epoch: [16]  [ 120/2375]  eta: 0:24:20  loss_objectness: 0.0058 (0.0079)  loss: 0.4175 (0.4449)  loss_rpn_box_reg: 0.0116 (0.0136)  loss_mask: 0.2268 (0.2460)  loss_box_reg: 0.0773 (0.0816)  loss_classifier: 0.0835 (0.0959)  lr: 0.000000  time: 0.6501  data: 0.0072  max mem: 4978
Epoch: [16]  [ 130/2375]  eta: 0:24:14  loss_objectness: 0.0063 (0.0079)  loss: 0.4159 (0.4442)  loss_rpn_box_reg: 0.0133 (0.0135)  loss_mask: 0.2268 (0.2458)  loss_box_reg: 0.0773 (0.0811)  loss_classifier: 0.0835 (0.0959)  lr: 0.000000  time: 0.6530  data: 0.0072  max mem: 4978
Epoch: [16]  [ 140/2375]  eta: 0:24:07  loss_objectness: 0.0059 (0.0078)  loss: 0.4138 (0.4446)  loss_rpn_box_reg: 0.0133 (0.0139)  loss_mask: 0.2419 (0.2458)  loss_box_reg: 0.0768 (0.0810)  loss_classifier: 0.0906 (0.0961)  lr: 0.000000  time: 0.6470  data: 0.0071  max mem: 4978
Epoch: [16]  [ 150/2375]  eta: 0:24:01  loss_objectness: 0.0026 (0.0075)  loss: 0.4143 (0.4420)  loss_rpn_box_reg: 0.0138 (0.0139)  loss_mask: 0.2453 (0.2454)  loss_box_reg: 0.0654 (0.0804)  loss_classifier: 0.0728 (0.0949)  lr: 0.000000  time: 0.6481  data: 0.0072  max mem: 4978
Epoch: [16]  [ 160/2375]  eta: 0:23:56  loss_objectness: 0.0029 (0.0079)  loss: 0.4185 (0.4431)  loss_rpn_box_reg: 0.0134 (0.0140)  loss_mask: 0.2429 (0.2456)  loss_box_reg: 0.0706 (0.0805)  loss_classifier: 0.0725 (0.0951)  lr: 0.000000  time: 0.6555  data: 0.0073  max mem: 4978
Epoch: [16]  [ 170/2375]  eta: 0:23:49  loss_objectness: 0.0063 (0.0085)  loss: 0.4756 (0.4470)  loss_rpn_box_reg: 0.0157 (0.0152)  loss_mask: 0.2356 (0.2455)  loss_box_reg: 0.0886 (0.0815)  loss_classifier: 0.1077 (0.0964)  lr: 0.000000  time: 0.6515  data: 0.0073  max mem: 4978
Epoch: [16]  [ 180/2375]  eta: 0:23:42  loss_objectness: 0.0043 (0.0084)  loss: 0.4755 (0.4483)  loss_rpn_box_reg: 0.0130 (0.0150)  loss_mask: 0.2441 (0.2459)  loss_box_reg: 0.0945 (0.0821)  loss_classifier: 0.1039 (0.0969)  lr: 0.000000  time: 0.6445  data: 0.0073  max mem: 4978
Epoch: [16]  [ 190/2375]  eta: 0:23:36  loss_objectness: 0.0057 (0.0085)  loss: 0.4665 (0.4504)  loss_rpn_box_reg: 0.0118 (0.0149)  loss_mask: 0.2441 (0.2464)  loss_box_reg: 0.0876 (0.0826)  loss_classifier: 0.1006 (0.0979)  lr: 0.000000  time: 0.6463  data: 0.0072  max mem: 4978
Epoch: [16]  [ 200/2375]  eta: 0:23:30  loss_objectness: 0.0067 (0.0085)  loss: 0.4552 (0.4512)  loss_rpn_box_reg: 0.0121 (0.0148)  loss_mask: 0.2354 (0.2466)  loss_box_reg: 0.0847 (0.0829)  loss_classifier: 0.1058 (0.0983)  lr: 0.000000  time: 0.6493  data: 0.0071  max mem: 4978
Epoch: [16]  [ 210/2375]  eta: 0:23:23  loss_objectness: 0.0034 (0.0089)  loss: 0.3964 (0.4498)  loss_rpn_box_reg: 0.0123 (0.0147)  loss_mask: 0.2244 (0.2459)  loss_box_reg: 0.0752 (0.0824)  loss_classifier: 0.0897 (0.0979)  lr: 0.000000  time: 0.6495  data: 0.0072  max mem: 4978
Epoch: [16]  [ 220/2375]  eta: 0:23:16  loss_objectness: 0.0033 (0.0089)  loss: 0.3813 (0.4489)  loss_rpn_box_reg: 0.0123 (0.0146)  loss_mask: 0.2221 (0.2454)  loss_box_reg: 0.0672 (0.0820)  loss_classifier: 0.0885 (0.0979)  lr: 0.000000  time: 0.6450  data: 0.0075  max mem: 4978
Epoch: [16]  [ 230/2375]  eta: 0:23:09  loss_objectness: 0.0032 (0.0087)  loss: 0.3813 (0.4468)  loss_rpn_box_reg: 0.0105 (0.0146)  loss_mask: 0.2221 (0.2446)  loss_box_reg: 0.0707 (0.0816)  loss_classifier: 0.0814 (0.0973)  lr: 0.000000  time: 0.6456  data: 0.0075  max mem: 4978
Epoch: [16]  [ 240/2375]  eta: 0:23:04  loss_objectness: 0.0037 (0.0091)  loss: 0.4649 (0.4506)  loss_rpn_box_reg: 0.0131 (0.0146)  loss_mask: 0.2379 (0.2459)  loss_box_reg: 0.0908 (0.0827)  loss_classifier: 0.0931 (0.0983)  lr: 0.000000  time: 0.6530  data: 0.0072  max mem: 4978
Epoch: [16]  [ 250/2375]  eta: 0:22:57  loss_objectness: 0.0067 (0.0096)  loss: 0.5148 (0.4513)  loss_rpn_box_reg: 0.0135 (0.0148)  loss_mask: 0.2521 (0.2461)  loss_box_reg: 0.0952 (0.0825)  loss_classifier: 0.1030 (0.0983)  lr: 0.000000  time: 0.6494  data: 0.0072  max mem: 4978
Epoch: [16]  [ 260/2375]  eta: 0:22:50  loss_objectness: 0.0085 (0.0096)  loss: 0.4032 (0.4512)  loss_rpn_box_reg: 0.0131 (0.0147)  loss_mask: 0.2153 (0.2460)  loss_box_reg: 0.0630 (0.0825)  loss_classifier: 0.0868 (0.0985)  lr: 0.000000  time: 0.6413  data: 0.0074  max mem: 4978
Epoch: [16]  [ 270/2375]  eta: 0:22:43  loss_objectness: 0.0068 (0.0097)  loss: 0.4052 (0.4514)  loss_rpn_box_reg: 0.0108 (0.0146)  loss_mask: 0.2260 (0.2461)  loss_box_reg: 0.0630 (0.0822)  loss_classifier: 0.0868 (0.0988)  lr: 0.000000  time: 0.6461  data: 0.0078  max mem: 4978
Epoch: [16]  [ 280/2375]  eta: 0:22:38  loss_objectness: 0.0059 (0.0096)  loss: 0.4335 (0.4505)  loss_rpn_box_reg: 0.0108 (0.0145)  loss_mask: 0.2270 (0.2456)  loss_box_reg: 0.0703 (0.0820)  loss_classifier: 0.0858 (0.0987)  lr: 0.000000  time: 0.6539  data: 0.0076  max mem: 4978
Epoch: [16]  [ 290/2375]  eta: 0:22:31  loss_objectness: 0.0059 (0.0098)  loss: 0.4238 (0.4511)  loss_rpn_box_reg: 0.0133 (0.0145)  loss_mask: 0.2363 (0.2460)  loss_box_reg: 0.0761 (0.0821)  loss_classifier: 0.0857 (0.0987)  lr: 0.000000  time: 0.6512  data: 0.0072  max mem: 4978
Epoch: [16]  [ 300/2375]  eta: 0:22:24  loss_objectness: 0.0041 (0.0096)  loss: 0.4084 (0.4492)  loss_rpn_box_reg: 0.0103 (0.0144)  loss_mask: 0.2227 (0.2457)  loss_box_reg: 0.0633 (0.0813)  loss_classifier: 0.0772 (0.0983)  lr: 0.000000  time: 0.6436  data: 0.0074  max mem: 4978
Epoch: [16]  [ 310/2375]  eta: 0:22:18  loss_objectness: 0.0025 (0.0097)  loss: 0.4065 (0.4493)  loss_rpn_box_reg: 0.0129 (0.0144)  loss_mask: 0.2151 (0.2455)  loss_box_reg: 0.0638 (0.0815)  loss_classifier: 0.0889 (0.0983)  lr: 0.000000  time: 0.6454  data: 0.0076  max mem: 4978
Epoch: [16]  [ 320/2375]  eta: 0:22:11  loss_objectness: 0.0026 (0.0095)  loss: 0.4065 (0.4472)  loss_rpn_box_reg: 0.0108 (0.0143)  loss_mask: 0.2318 (0.2449)  loss_box_reg: 0.0670 (0.0808)  loss_classifier: 0.0889 (0.0976)  lr: 0.000000  time: 0.6456  data: 0.0076  max mem: 4978
Epoch: [16]  [ 330/2375]  eta: 0:22:04  loss_objectness: 0.0039 (0.0094)  loss: 0.3962 (0.4470)  loss_rpn_box_reg: 0.0101 (0.0142)  loss_mask: 0.2318 (0.2450)  loss_box_reg: 0.0667 (0.0807)  loss_classifier: 0.0957 (0.0977)  lr: 0.000000  time: 0.6460  data: 0.0075  max mem: 4978
Epoch: [16]  [ 340/2375]  eta: 0:21:58  loss_objectness: 0.0064 (0.0094)  loss: 0.4481 (0.4480)  loss_rpn_box_reg: 0.0102 (0.0142)  loss_mask: 0.2356 (0.2452)  loss_box_reg: 0.0824 (0.0810)  loss_classifier: 0.1017 (0.0981)  lr: 0.000000  time: 0.6509  data: 0.0074  max mem: 4978
Epoch: [16]  [ 350/2375]  eta: 0:21:52  loss_objectness: 0.0032 (0.0098)  loss: 0.4173 (0.4474)  loss_rpn_box_reg: 0.0104 (0.0142)  loss_mask: 0.2313 (0.2449)  loss_box_reg: 0.0776 (0.0807)  loss_classifier: 0.0904 (0.0978)  lr: 0.000000  time: 0.6548  data: 0.0074  max mem: 4978
Epoch: [16]  [ 360/2375]  eta: 0:21:46  loss_objectness: 0.0030 (0.0096)  loss: 0.4128 (0.4467)  loss_rpn_box_reg: 0.0130 (0.0142)  loss_mask: 0.2299 (0.2444)  loss_box_reg: 0.0648 (0.0806)  loss_classifier: 0.0904 (0.0978)  lr: 0.000000  time: 0.6555  data: 0.0073  max mem: 4978
Epoch: [16]  [ 370/2375]  eta: 0:21:39  loss_objectness: 0.0033 (0.0095)  loss: 0.4266 (0.4470)  loss_rpn_box_reg: 0.0130 (0.0142)  loss_mask: 0.2337 (0.2448)  loss_box_reg: 0.0704 (0.0806)  loss_classifier: 0.0971 (0.0978)  lr: 0.000000  time: 0.6492  data: 0.0072  max mem: 4978
Epoch: [16]  [ 380/2375]  eta: 0:21:33  loss_objectness: 0.0033 (0.0094)  loss: 0.4259 (0.4456)  loss_rpn_box_reg: 0.0113 (0.0141)  loss_mask: 0.2465 (0.2445)  loss_box_reg: 0.0704 (0.0803)  loss_classifier: 0.0897 (0.0972)  lr: 0.000000  time: 0.6432  data: 0.0071  max mem: 4978
Epoch: [16]  [ 390/2375]  eta: 0:21:26  loss_objectness: 0.0039 (0.0098)  loss: 0.4601 (0.4473)  loss_rpn_box_reg: 0.0114 (0.0144)  loss_mask: 0.2473 (0.2452)  loss_box_reg: 0.0784 (0.0806)  loss_classifier: 0.0901 (0.0974)  lr: 0.000000  time: 0.6491  data: 0.0071  max mem: 4978
Epoch: [16]  [ 400/2375]  eta: 0:21:20  loss_objectness: 0.0035 (0.0098)  loss: 0.4649 (0.4476)  loss_rpn_box_reg: 0.0121 (0.0143)  loss_mask: 0.2463 (0.2452)  loss_box_reg: 0.0795 (0.0806)  loss_classifier: 0.0964 (0.0977)  lr: 0.000000  time: 0.6492  data: 0.0071  max mem: 4978
Epoch: [16]  [ 410/2375]  eta: 0:21:13  loss_objectness: 0.0041 (0.0098)  loss: 0.4161 (0.4478)  loss_rpn_box_reg: 0.0112 (0.0143)  loss_mask: 0.2410 (0.2456)  loss_box_reg: 0.0730 (0.0806)  loss_classifier: 0.0972 (0.0976)  lr: 0.000000  time: 0.6460  data: 0.0072  max mem: 4978
Epoch: [16]  [ 420/2375]  eta: 0:21:07  loss_objectness: 0.0075 (0.0097)  loss: 0.4506 (0.4484)  loss_rpn_box_reg: 0.0134 (0.0143)  loss_mask: 0.2415 (0.2459)  loss_box_reg: 0.0762 (0.0808)  loss_classifier: 0.0972 (0.0977)  lr: 0.000000  time: 0.6505  data: 0.0071  max mem: 4978
Epoch: [16]  [ 430/2375]  eta: 0:21:01  loss_objectness: 0.0050 (0.0097)  loss: 0.4413 (0.4481)  loss_rpn_box_reg: 0.0112 (0.0143)  loss_mask: 0.2361 (0.2457)  loss_box_reg: 0.0742 (0.0806)  loss_classifier: 0.0856 (0.0978)  lr: 0.000000  time: 0.6514  data: 0.0071  max mem: 4978
some issue here. skipping.
Epoch: [17]  [   0/2375]  eta: 0:35:37  loss_objectness: 0.0036 (0.0036)  loss: 0.3479 (0.3479)  loss_rpn_box_reg: 0.0131 (0.0131)  loss_mask: 0.2323 (0.2323)  loss_box_reg: 0.0455 (0.0455)  loss_classifier: 0.0533 (0.0533)  lr: 0.000000  time: 0.9000  data: 0.2476  max mem: 4978
Epoch: [17]  [  10/2375]  eta: 0:26:36  loss_objectness: 0.0036 (0.0057)  loss: 0.4440 (0.4264)  loss_rpn_box_reg: 0.0139 (0.0183)  loss_mask: 0.2323 (0.2360)  loss_box_reg: 0.0744 (0.0779)  loss_classifier: 0.0894 (0.0885)  lr: 0.000000  time: 0.6749  data: 0.0293  max mem: 4978
Epoch: [17]  [  20/2375]  eta: 0:25:56  loss_objectness: 0.0039 (0.0064)  loss: 0.4263 (0.4054)  loss_rpn_box_reg: 0.0119 (0.0145)  loss_mask: 0.2231 (0.2240)  loss_box_reg: 0.0713 (0.0714)  loss_classifier: 0.0867 (0.0890)  lr: 0.000000  time: 0.6488  data: 0.0073  max mem: 4978
Epoch: [17]  [  30/2375]  eta: 0:25:34  loss_objectness: 0.0063 (0.0084)  loss: 0.3892 (0.4258)  loss_rpn_box_reg: 0.0118 (0.0144)  loss_mask: 0.2297 (0.2354)  loss_box_reg: 0.0713 (0.0738)  loss_classifier: 0.0886 (0.0938)  lr: 0.000000  time: 0.6432  data: 0.0072  max mem: 4978
Epoch: [17]  [  40/2375]  eta: 0:25:29  loss_objectness: 0.0063 (0.0082)  loss: 0.4508 (0.4235)  loss_rpn_box_reg: 0.0138 (0.0141)  loss_mask: 0.2349 (0.2331)  loss_box_reg: 0.0826 (0.0747)  loss_classifier: 0.0976 (0.0933)  lr: 0.000000  time: 0.6493  data: 0.0072  max mem: 4978
Epoch: [17]  [  50/2375]  eta: 0:25:15  loss_objectness: 0.0036 (0.0082)  loss: 0.4508 (0.4272)  loss_rpn_box_reg: 0.0127 (0.0139)  loss_mask: 0.2386 (0.2371)  loss_box_reg: 0.0795 (0.0757)  loss_classifier: 0.0871 (0.0922)  lr: 0.000000  time: 0.6477  data: 0.0072  max mem: 4978
Epoch: [17]  [  60/2375]  eta: 0:25:07  loss_objectness: 0.0036 (0.0078)  loss: 0.4396 (0.4241)  loss_rpn_box_reg: 0.0108 (0.0135)  loss_mask: 0.2403 (0.2365)  loss_box_reg: 0.0683 (0.0752)  loss_classifier: 0.0871 (0.0912)  lr: 0.000000  time: 0.6427  data: 0.0071  max mem: 4978
Epoch: [17]  [  70/2375]  eta: 0:24:55  loss_objectness: 0.0039 (0.0074)  loss: 0.4290 (0.4255)  loss_rpn_box_reg: 0.0091 (0.0130)  loss_mask: 0.2307 (0.2385)  loss_box_reg: 0.0706 (0.0755)  loss_classifier: 0.0831 (0.0911)  lr: 0.000000  time: 0.6418  data: 0.0071  max mem: 4978
Epoch: [17]  [  80/2375]  eta: 0:24:50  loss_objectness: 0.0042 (0.0075)  loss: 0.4445 (0.4274)  loss_rpn_box_reg: 0.0091 (0.0131)  loss_mask: 0.2493 (0.2397)  loss_box_reg: 0.0775 (0.0757)  loss_classifier: 0.0907 (0.0914)  lr: 0.000000  time: 0.6448  data: 0.0071  max mem: 4978
Epoch: [17]  [  90/2375]  eta: 0:24:42  loss_objectness: 0.0033 (0.0072)  loss: 0.4367 (0.4278)  loss_rpn_box_reg: 0.0123 (0.0131)  loss_mask: 0.2445 (0.2398)  loss_box_reg: 0.0761 (0.0753)  loss_classifier: 0.0948 (0.0924)  lr: 0.000000  time: 0.6475  data: 0.0071  max mem: 4978
Epoch: [17]  [ 100/2375]  eta: 0:24:33  loss_objectness: 0.0023 (0.0076)  loss: 0.4042 (0.4268)  loss_rpn_box_reg: 0.0118 (0.0130)  loss_mask: 0.2193 (0.2391)  loss_box_reg: 0.0661 (0.0748)  loss_classifier: 0.0948 (0.0923)  lr: 0.000000  time: 0.6402  data: 0.0071  max mem: 4978
Epoch: [17]  [ 110/2375]  eta: 0:24:25  loss_objectness: 0.0046 (0.0078)  loss: 0.4042 (0.4303)  loss_rpn_box_reg: 0.0120 (0.0129)  loss_mask: 0.2251 (0.2396)  loss_box_reg: 0.0733 (0.0762)  loss_classifier: 0.0968 (0.0938)  lr: 0.000000  time: 0.6406  data: 0.0071  max mem: 4978
Epoch: [17]  [ 120/2375]  eta: 0:24:19  loss_objectness: 0.0044 (0.0077)  loss: 0.4122 (0.4281)  loss_rpn_box_reg: 0.0121 (0.0128)  loss_mask: 0.2301 (0.2386)  loss_box_reg: 0.0733 (0.0755)  loss_classifier: 0.0949 (0.0935)  lr: 0.000000  time: 0.6456  data: 0.0071  max mem: 4978
some issue here. skipping.
Epoch: [18]  [   0/2375]  eta: 0:37:11  loss_objectness: 0.0057 (0.0057)  loss: 0.5582 (0.5582)  loss_rpn_box_reg: 0.0123 (0.0123)  loss_mask: 0.3189 (0.3189)  loss_box_reg: 0.0758 (0.0758)  loss_classifier: 0.1456 (0.1456)  lr: 0.000000  time: 0.9394  data: 0.2888  max mem: 4978
Epoch: [18]  [  10/2375]  eta: 0:26:26  loss_objectness: 0.0042 (0.0089)  loss: 0.4385 (0.4527)  loss_rpn_box_reg: 0.0123 (0.0125)  loss_mask: 0.2510 (0.2532)  loss_box_reg: 0.0617 (0.0740)  loss_classifier: 0.0945 (0.1042)  lr: 0.000000  time: 0.6706  data: 0.0331  max mem: 4978
some issue here. skipping.
Epoch: [19]  [   0/2375]  eta: 0:35:12  loss_objectness: 0.0010 (0.0010)  loss: 0.2548 (0.2548)  loss_rpn_box_reg: 0.0046 (0.0046)  loss_mask: 0.1736 (0.1736)  loss_box_reg: 0.0283 (0.0283)  loss_classifier: 0.0473 (0.0473)  lr: 0.000000  time: 0.8895  data: 0.2408  max mem: 4978
Epoch: [19]  [  10/2375]  eta: 0:26:14  loss_objectness: 0.0040 (0.0252)  loss: 0.3934 (0.4363)  loss_rpn_box_reg: 0.0109 (0.0174)  loss_mask: 0.2282 (0.2299)  loss_box_reg: 0.0621 (0.0698)  loss_classifier: 0.0822 (0.0939)  lr: 0.000000  time: 0.6658  data: 0.0290  max mem: 4978
Epoch: [19]  [  20/2375]  eta: 0:25:46  loss_objectness: 0.0040 (0.0155)  loss: 0.3949 (0.4334)  loss_rpn_box_reg: 0.0109 (0.0151)  loss_mask: 0.2394 (0.2405)  loss_box_reg: 0.0692 (0.0722)  loss_classifier: 0.0830 (0.0901)  lr: 0.000000  time: 0.6450  data: 0.0074  max mem: 4978
Epoch: [19]  [  30/2375]  eta: 0:25:42  loss_objectness: 0.0036 (0.0124)  loss: 0.3866 (0.4249)  loss_rpn_box_reg: 0.0099 (0.0139)  loss_mask: 0.2331 (0.2371)  loss_box_reg: 0.0692 (0.0718)  loss_classifier: 0.0826 (0.0897)  lr: 0.000000  time: 0.6533  data: 0.0071  max mem: 4978
Epoch: [19]  [  40/2375]  eta: 0:25:30  loss_objectness: 0.0034 (0.0106)  loss: 0.3943 (0.4301)  loss_rpn_box_reg: 0.0107 (0.0142)  loss_mask: 0.2331 (0.2409)  loss_box_reg: 0.0745 (0.0742)  loss_classifier: 0.0847 (0.0902)  lr: 0.000000  time: 0.6544  data: 0.0072  max mem: 4978
Epoch: [19]  [  50/2375]  eta: 0:25:19  loss_objectness: 0.0053 (0.0097)  loss: 0.4337 (0.4343)  loss_rpn_box_reg: 0.0117 (0.0138)  loss_mask: 0.2452 (0.2408)  loss_box_reg: 0.0817 (0.0759)  loss_classifier: 0.0848 (0.0941)  lr: 0.000000  time: 0.6471  data: 0.0071  max mem: 4978
Epoch: [19]  [  60/2375]  eta: 0:25:11  loss_objectness: 0.0053 (0.0093)  loss: 0.4254 (0.4324)  loss_rpn_box_reg: 0.0114 (0.0134)  loss_mask: 0.2321 (0.2387)  loss_box_reg: 0.0723 (0.0760)  loss_classifier: 0.0844 (0.0950)  lr: 0.000000  time: 0.6476  data: 0.0071  max mem: 4978
Epoch: [19]  [  70/2375]  eta: 0:25:06  loss_objectness: 0.0028 (0.0085)  loss: 0.3701 (0.4289)  loss_rpn_box_reg: 0.0112 (0.0135)  loss_mask: 0.2206 (0.2378)  loss_box_reg: 0.0746 (0.0758)  loss_classifier: 0.0797 (0.0933)  lr: 0.000000  time: 0.6537  data: 0.0071  max mem: 4978
Epoch: [19]  [  80/2375]  eta: 0:24:57  loss_objectness: 0.0023 (0.0084)  loss: 0.3668 (0.4271)  loss_rpn_box_reg: 0.0097 (0.0129)  loss_mask: 0.2356 (0.2377)  loss_box_reg: 0.0621 (0.0754)  loss_classifier: 0.0737 (0.0928)  lr: 0.000000  time: 0.6507  data: 0.0071  max mem: 4978
Epoch: [19]  [  90/2375]  eta: 0:24:50  loss_objectness: 0.0024 (0.0079)  loss: 0.3852 (0.4223)  loss_rpn_box_reg: 0.0076 (0.0123)  loss_mask: 0.2241 (0.2361)  loss_box_reg: 0.0613 (0.0745)  loss_classifier: 0.0762 (0.0915)  lr: 0.000000  time: 0.6482  data: 0.0073  max mem: 4978
Epoch: [19]  [ 100/2375]  eta: 0:24:42  loss_objectness: 0.0029 (0.0084)  loss: 0.3977 (0.4308)  loss_rpn_box_reg: 0.0104 (0.0127)  loss_mask: 0.2160 (0.2383)  loss_box_reg: 0.0778 (0.0769)  loss_classifier: 0.0826 (0.0945)  lr: 0.000000  time: 0.6495  data: 0.0073  max mem: 4978
Epoch: [19]  [ 110/2375]  eta: 0:24:33  loss_objectness: 0.0089 (0.0097)  loss: 0.4987 (0.4404)  loss_rpn_box_reg: 0.0145 (0.0131)  loss_mask: 0.2545 (0.2420)  loss_box_reg: 0.0971 (0.0788)  loss_classifier: 0.1112 (0.0968)  lr: 0.000000  time: 0.6422  data: 0.0072  max mem: 4978
Epoch: [19]  [ 120/2375]  eta: 0:24:26  loss_objectness: 0.0038 (0.0093)  loss: 0.4957 (0.4417)  loss_rpn_box_reg: 0.0146 (0.0132)  loss_mask: 0.2554 (0.2428)  loss_box_reg: 0.0971 (0.0794)  loss_classifier: 0.1112 (0.0970)  lr: 0.000000  time: 0.6428  data: 0.0072  max mem: 4978
Epoch: [19]  [ 130/2375]  eta: 0:24:18  loss_objectness: 0.0034 (0.0091)  loss: 0.4207 (0.4390)  loss_rpn_box_reg: 0.0107 (0.0129)  loss_mask: 0.2476 (0.2428)  loss_box_reg: 0.0614 (0.0783)  loss_classifier: 0.0847 (0.0959)  lr: 0.000000  time: 0.6450  data: 0.0072  max mem: 4978
Epoch: [19]  [ 140/2375]  eta: 0:24:12  loss_objectness: 0.0024 (0.0086)  loss: 0.3945 (0.4355)  loss_rpn_box_reg: 0.0084 (0.0127)  loss_mask: 0.2206 (0.2426)  loss_box_reg: 0.0593 (0.0772)  loss_classifier: 0.0766 (0.0945)  lr: 0.000000  time: 0.6482  data: 0.0071  max mem: 4978
Epoch: [19]  [ 150/2375]  eta: 0:24:05  loss_objectness: 0.0025 (0.0086)  loss: 0.3633 (0.4358)  loss_rpn_box_reg: 0.0093 (0.0129)  loss_mask: 0.2151 (0.2417)  loss_box_reg: 0.0571 (0.0773)  loss_classifier: 0.0741 (0.0952)  lr: 0.000000  time: 0.6481  data: 0.0072  max mem: 4978
some issue here. skipping.
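The repeated "some issue here. skipping." lines in the log above look like the message printed by an except-branch that skips a failing epoch and moves on. The actual loop is defined earlier in the notebook; this is only a hedged, self-contained sketch of that shape (`run_epochs` and `flaky_epoch` are illustrative names, not from the notebook):

```python
# Sketch of a skip-on-failure training loop that would produce the
# "some issue here. skipping." lines seen in the output above.
def run_epochs(num_epochs, one_epoch):
    log = []
    for epoch in range(num_epochs):
        try:
            # In the notebook this would be train_one_epoch(model, optimizer, ...)
            one_epoch(epoch)
            log.append('epoch %d done' % epoch)
        except Exception:
            # Mirrors the message seen in the training output.
            log.append('some issue here. skipping.')
    return log

def flaky_epoch(epoch):
    # Simulates one bad epoch (e.g. a corrupt batch) out of three.
    if epoch == 1:
        raise RuntimeError('bad batch')

log = run_epochs(3, flaky_epoch)
```

The upside of this pattern is that one bad batch or OOM does not kill a multi-hour run; the downside is that it can silently hide real bugs, so logging the actual exception would be an improvement over a fixed message.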

Test on some images after training:

In [24]:
num_classes = 2
# map_location lets a checkpoint saved on GPU load on a CPU-only machine too
state_dict = torch.load('finetuned_19.pth', map_location=device)
model.load_state_dict(state_dict)
# move the model to the right device and switch to inference mode
model.to(device)
model.eval();
In [25]:
plot_mask_rcnn_result('egohands/DATA_IMAGES/Image9_26.jpg', threshold=0.9)
In [26]:
plot_mask_rcnn_result('egohands/DATA_IMAGES/Image10_26.jpg', threshold=0.6)

Looks good. Now let's test on some images that are not in the dataset.

In [30]:
plot_mask_rcnn_result('test_images/1.jpg', threshold=0.9)
In [36]:
plot_mask_rcnn_result('test_images/2.jpg', threshold=0.7)
In [40]:
plot_mask_rcnn_result('test_images/3.jpg', threshold=0.8)
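`plot_mask_rcnn_result` is defined earlier in the notebook; its `threshold` argument presumably filters detections by confidence score before plotting, roughly along these lines (a hedged NumPy sketch, not the notebook's actual implementation; `filter_by_score` is an illustrative name):

```python
import numpy as np

def filter_by_score(boxes, scores, masks, threshold):
    # Keep only detections whose confidence is at least the threshold.
    keep = scores >= threshold
    return boxes[keep], scores[keep], masks[keep]

# Two toy detections: one confident, one not.
boxes = np.array([[10, 10, 50, 50], [5, 5, 20, 20]], dtype=float)
scores = np.array([0.95, 0.40])
masks = np.zeros((2, 4, 4))

b, s, m = filter_by_score(boxes, scores, masks, threshold=0.9)
# Only the 0.95-score detection survives a 0.9 threshold.
```

This is why the calls above use different thresholds per image: lowering the threshold shows more (but noisier) detections.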

Sometimes it detects faces as hands. This is probably because there are no faces in the training set, so adding more training images that contain faces would likely mitigate this problem.

During training I experimented with different hyperparameters, one of which was switching the optimizer from SGD to Adam. Adam did not tend to produce NaNs, so I stuck with it.

The initial learning rate was set to 0.01, which was later reduced to 0.005.

With these settings I got good enough results, so these are the final parameters the model was trained with.
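A minimal sketch of that final optimizer setup, assuming the standard `torch.optim` API; the `params` list here is a stand-in for the Mask R-CNN's trainable parameters, and the momentum value in the commented-out SGD line is an assumption, not from the notebook:

```python
import torch

# Stand-in parameter list; the notebook would pass model.parameters() instead.
params = [torch.nn.Parameter(torch.zeros(2, 2))]

# What was tried first (tended to produce NaNs in this setup):
# optimizer = torch.optim.SGD(params, lr=0.01, momentum=0.9)

# What was kept: Adam, with the learning rate lowered from 0.01 to 0.005.
optimizer = torch.optim.Adam(params, lr=0.005)
```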
